Cameras In Your Ears

The 'AirPods with Cameras' rumors start to make sense...

I had been slightly confused by the rumors of "AirPods with cameras" being in Apple's pipeline of products. Everyone's assumption seemed to be that these could be Apple's answer to products such as Meta's Ray-Ban smart glasses – that is, AI-enabled wearables with cameras to help enhance the experience. But putting a camera in your ears, or even in the "stem" next to your jaw, seems weird to me. Could such a hybrid device possibly take a good picture?

As I wrote back in February based on some early reporting from Mark Gurman on the matter:

As for the "AirPods with cameras" idea? Please don't do that, Apple.

But this report from Ming-Chi Kuo would seemingly make a lot more sense. What if the "camera" sensor isn't meant to take pictures, but instead to gauge your environment to help augment (literally) other aspects of computing?

The new AirPods is expected to be used with Vision Pro and future Apple headsets to enhance the user experience of spatial audio and strengthen the spatial computing ecosystem. For example, when a user is watching a video with Vision Pro and wearing this new AirPods, if users turn their heads to look in a specific direction, the sound source in that direction can be emphasized to enhance the spatial audio/computing experience.
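
To make that concrete: here's a rough sketch (entirely my speculation, not anything Apple or Kuo has described) of how head-orientation data from the earbuds could be used to boost whichever sound source you're facing. All the names, types, and numbers here are made up purely for illustration.

```swift
import simd

// Hypothetical audio source in a spatial scene.
struct AudioSource {
    let position: SIMD3<Float>   // world-space position of the source
    var gain: Float = 1.0        // base volume
}

// Boost sources that align with the direction the head is pointing.
// "boost" controls how much louder a directly-faced source gets.
func emphasizedGains(for sources: [AudioSource],
                     headPosition: SIMD3<Float>,
                     headForward: SIMD3<Float>,
                     boost: Float = 0.5) -> [Float] {
    let forward = simd_normalize(headForward)
    return sources.map { source in
        let toSource = simd_normalize(source.position - headPosition)
        // 1.0 when looking straight at the source, -1.0 when facing away.
        let alignment = simd_dot(forward, toSource)
        // Emphasize sources in front of the listener; leave the rest at base gain.
        let emphasis = 1.0 + boost * max(0, alignment)
        return source.gain * emphasis
    }
}
```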

Often lost in any talk about AR/VR/XR is the importance of sound. This is obviously because the biggest element of these technologies revolves around sight. But what you hear in such environments is a crucial part of the immersion. That's especially true if you're trying to "trick" your brain into thinking what you're seeing is real. Said another way, "Spatial Computing" isn't just about sight, but also sound and, likely one day, touch.1

And such cameras may have another trick too:

The IR camera can detect environmental image changes, potentially enabling in-air gesture control to enhance human-device interaction. It is worth noting that Apple has filed related patents in this area.
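
For the gesture side, the crudest possible version of "detect environmental image changes" is simple frame differencing: compare consecutive low-resolution IR frames and flag motion when enough pixels change. This is just a minimal sketch under that assumption; Apple's patents presumably describe something far more sophisticated, and every name and threshold here is hypothetical.

```swift
// Hypothetical grayscale frame from the IR sensor.
struct IRFrame {
    let pixels: [UInt8]   // intensity values, one per pixel
}

// Returns true when a meaningful fraction of pixels changed between frames.
// A real gesture recognizer would then classify the motion, not just flag it.
func motionDetected(previous: IRFrame,
                    current: IRFrame,
                    pixelThreshold: UInt8 = 25,
                    changedFraction: Double = 0.05) -> Bool {
    guard previous.pixels.count == current.pixels.count, !current.pixels.isEmpty else {
        return false
    }
    // Count pixels whose intensity changed by more than the threshold.
    let changed = zip(previous.pixels, current.pixels).reduce(0) { count, pair in
        let delta = Int(pair.0) - Int(pair.1)
        return abs(delta) > Int(pixelThreshold) ? count + 1 : count
    }
    return Double(changed) / Double(current.pixels.count) > changedFraction
}
```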

If Apple could offload (or enhance) some of the sensors on the Vision Pro, it certainly couldn't hurt that device, which they're clearly desperate to shrink (both in size and in cost).2

Anyway, Kuo has had a mixed track record with his reports of late, but he obviously has strong supply chain sources, and the components needed for this "camera" would seem to make more sense than, say, a point-and-shoot from your ears.


1 Ideally not smell. But who knows, from Smell-O-Vision onward, those crazy scientists keep trying...

2 You know what else could help with all this? A ring. That remains the new piece of "wearable" hardware that I'd be most compelled by Apple working on. They undoubtedly think they can do most of it with the Apple Watch already, but there's potentially a different product and market here...