
Our weird wearable AI future


Welcome to Lowpass! This week: Testing multimodal AI on Meta’s Ray-Ban smart glasses, and Roku’s home screen dilemma.

The future of wearable AI is going to be weird

I’ve spent some time in recent weeks testing Meta’s new Ray-Ban smart glasses, with a specific focus on a brand-new feature: After first announcing it at last year’s Connect conference, Meta recently added multimodal AI functionality to the glasses. In a nutshell, this lets Ray-Ban owners ask the AI about anything in their field of view: the glasses take a picture, upload it to the cloud, and the AI then identifies objects in the image and answers questions about them.
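To make that flow a little more concrete, here is a rough Python sketch of the capture-and-ask loop as I’ve just described it. It’s illustrative only: every name in it (capture_photo, ask_multimodal_model, Answer) is a placeholder I made up, not Meta’s actual API.

```python
# Illustrative only: a rough sketch of the "look and ask" flow described above.
# All names here are placeholders invented for this sketch, not Meta's API.

import base64
from dataclasses import dataclass


@dataclass
class Answer:
    text: str


def capture_photo() -> bytes:
    """Stand-in for the glasses' camera: return raw image bytes."""
    return b"<jpeg bytes would go here>"


def ask_multimodal_model(image: bytes, question: str) -> Answer:
    """Stand-in for the cloud call: bundle the photo with the question and
    pretend a multimodal model sent back a natural-language answer."""
    payload = {
        "image_b64": base64.b64encode(image).decode("ascii"),
        "question": question,
    }
    return Answer(text=f"(model's answer to: {payload['question']})")


if __name__ == "__main__":
    photo = capture_photo()                      # 1. take a picture
    reply = ask_multimodal_model(
        photo, "Hey Meta, what am I looking at?" # 2. upload it with the question
    )
    print(reply.text)                            # 3. the answer is read back to the wearer
```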

The multimodal AI feature is part of an early access program. Some of the most promising functionality teased at Connect isn’t available yet, including the ability to ask about landmarks. Some of it takes a bit of getting used to. And some answers are just plain wrong – a possibility Meta acknowledged in a recent blog post.

However, using the glasses also gave me insights into the potential of wearable AI, as well as into some more fundamental challenges with this type of technology. It also provided a bit of unintentional comedy, and more than one jaw-dropping moment.

I did not, for instance, expect the AI to cheerfully suggest that I should get rid of my dog.

(...)

Subscribe to Premium to read the rest.

Become a paying subscriber of Premium to get access to this post and other subscriber-only content.


A subscription gets you:

  • A full-length newsletter every week
  • No ads or sponsorship messages
  • Access to every story on Lowpass.cc
  • Access to a subscriber-only Slack space and subscriber-only events
