Meta Outsmarted Snap in Smart Glasses
The rise of Meta's Ray-Ban smart glasses is one of the most fascinating tech tales in recent memory. A few years ago, if I had told you that Meta would hold a key position in tech wearables – notably, ones that are fairly cool-looking – you would have thought I was crazy. The digital data spying company that gave us Cambridge Analytica? The social network your parents use, Facebook? THEY WANT TO PUT CAMERAS ON YOUR FACE?! But they did it, those crazy sonofabitches did it.
The funny thing about the Meta smart glasses is that nobody expected them to be as successful as they are. Partly because the first iteration, the Ray-Ban Stories, categorically flopped. Partly because they weren't offering up any new ideas. Bose had already made stylish audio sunglasses and then shuttered the whole operation. Snap's Spectacles had already tried recording short videos for social, and that clearly wasn't good enough, either. On paper, there was no compelling reason why the Ray-Ban Meta smart glasses ought to resonate with people.
And yet, they have succeeded where other AI wearables and smart glasses haven’t. Notably, beyond even Meta’s own expectations.
A lot of that boils down to Meta finally nailing style and execution. The Meta glasses come in far more styles and colors than the Stories did. You're almost guaranteed to find something that looks snazzy on you. In this respect, Meta was savvy enough to understand that the average person doesn't want to look like they just walked out of a sci-fi film. They want to look cool by today's standards.
Timing is everything. I actually think Snap had the right general approach when they launched the first Spectacles eight years ago – I wrote as much at the time! They were fun, colorful, and people were buzzing about them. The vending machine pop-ups were a brilliant marketing stunt. They were selling for crazy amounts of money on secondary markets. And Snap smartly positioned them as more of a "toy" than the future of everything. But once that initial novelty wore off, reality set in: the timing was just off. They were too early. Over the next few years, cameras and connectivity got better, as did battery life. But instead of sticking with that initial vision, Snap shifted toward a more, well, sci-fi-looking version of Spectacles with AR as the focus. (Google may be following suit.) And so they went from being too early in camera-focused smart glasses to being too early in AR-focused smart glasses.
Cut to Meta, ever the copycat. They picked up the ball that Snap dropped. And, as noted above, even though their first collaboration with Ray-Ban was a flop, they stuck with it (I was wrong). They made improvements and used the glasses almost as a foil to their thoroughly nerdy and expensive Quest VR headsets.
At $299, they're expensive, but affordable compared to a $3,500 Vision Pro or a $699 Humane Ai Pin. Audio quality is good. Call quality is surprisingly excellent thanks to a well-positioned mic in the nose bridge. Unlike the Stories or Snap's earlier Spectacles, video and photo quality is good enough to post to Instagram without feeling embarrassed, especially in the era of content creators, where POV-style Instagram Reels and TikToks do numbers.
I feel like I hear this a lot from folks who have a pair, including "regular" (non-tech) people. It's not access to AI functionality or some such that makes them good; most of these people don't use those features. It's that the glasses are really good at quickly taking pictures on the go. And more importantly, they double as solid open-ear speakers with a good microphone for taking calls. And again, they don't look ridiculous to wear in public. You know, the little things.
But again, timing is everything. In walks the aforementioned AI...
This is a device that can easily slot into people’s lives now. There’s no future software update to wait for. It’s not a solution looking for a problem to solve. And this, more than anything else, is exactly why the Ray-Bans have a shot at successfully figuring out AI.
That’s because AI is already on it — it’s just a feature, not the whole schtick. You can use it to identify objects you come across or tell you more about a landmark. You can ask Meta AI to write dubious captions for your Instagram post or translate a menu. You can video call a friend, and they’ll be able to see what you see. All of these use cases make sense for the device and how you’d use it.
Because the glasses launched before the AI revolution, Meta sort of stumbled into the realization that AI is a feature, not the entire product. When they push updates that try to make AI the focus of the product, well...
In practice, these features are a bit wonky and inelegant. Meta AI has yet to write me a good Instagram caption, and it often can't hear me in loud environments. But unlike the Rabbit R1, it works. Unlike Humane's device, it doesn't overheat, and there's no latency because it uses your phone for processing. Crucially, unlike either of those devices, if the AI shits the bed, the glasses can still do other things very well.
I'm very curious to see what Meta announces at their Connect conference later today. Newfangled AR glasses to counter what Snap is doing have been rumored – "Orion"? Project Puffin? – but it's not clear they really need to counter such a product, at least not yet. Meta hasn't been shy about talking up whatever they're about to showcase in this space, but it's also undoubtedly too early. To be fair, neither company intends to sell such wares to consumers right now – unlike, say, Apple – but still... Perhaps the focus should be on the space that Snap overshot: snazzy camera sunglasses. The perfect vessel for real-world AI, soon.