More Like 'Shudder Button'

Apple's path to train AI in the real world at scale...
iPhone 16 Pro and 16 Pro Max hands-on: don’t call it a shutter button
Bigger than ever.

By far the most compelling element of the iPhones 16 this year is the new "Shutter Button" – sorry, "Camera Control" button. Everyone knew it was coming thanks to supply chain leaks months ago, but it's a bit more interesting than anticipated.

First, on the hardware:

The reason it’s not *just* a shutter button is that it’s also a multifunctional capacitive control surface. The physical button itself is pressure-sensitive, so pressing it ever so lightly brings up swipe-to-zoom controls, and double-pressing it lightly brings up additional controls you can swipe between, like lens selection, exposure, and the new photo styles available on the Pro. It took me a second to calibrate how hard to press, but it wasn’t hard to figure out. Apple says that as part of a software update later this year, the button will get a two-stage shutter function that will allow you to lock focus and exposure.

Switching between the various photo styles with swipes was pretty seamless, but it was hard to see how much the styles were actually doing in the perfect lighting conditions of Apple’s demo area. Still, I’m very curious about them.

The capacitive control surface was also rumored, and it looks pretty slick in the demos Apple showed in a pre-recorded video yesterday. Per The Verge's hands-on, it seems like it will take a bit of getting used to, but it also seems decidedly better than doing all of this via overlays in the software.

I ran into Apple’s Phil Schiller, and we chatted briefly about the Camera Control button. I wanted to know about the balance of using the button as a classic camera control versus the beginning of the camera itself becoming an input method for Apple Intelligence, and he told me that it was really both, which is fascinating.

The "Visual Intelligence" element of the event was arguably the biggest surprise – which is not surprising given that it's software functionality and thus, more impervious to leaks. And while Apple made us wait to learn anything about it, when they did actually talk about it during the event, it was one of the few elements of the event that was legitimately compelling. Yes, it's Apple's answer to Google Lens – something I've been beyond intrigued by for a long time: I wrote this post seven years ago – but it's arguably more important than that in our current AI reality. It seems to point to something I've been wondering how/if Apple would take advantage of...

As I wrote back in January – in one of the very first Spyglass posts! – when Apple's AI functionality was all still very much rumored:

I mean, plenty of these LLMs now have the ability to take, say, uploaded images or videos. But what Apple could do here is potentially different – again, because they have direct access to billions of devices, the majority of which have insanely great cameras built into them. This is part of what excited people about the Rabbit "R1" device that seemed to steal the show at CES this year. It wasn't just another software-focused AI tool, it was a tangible device for using AI in the real world.

But unlike a newly created, soon-to-ship product (ditto the forthcoming Humane "Ai Pin"), imagine if you had a device already in a billion pockets around the world. A device that is massively more powerful than anything a startup could possibly produce. Not to mention the planet's best channels to move new devices...

Apple is getting dinged for being "late" to the AI revolution. But given how fast things are moving and how much has yet to be determined here, if anything, whatever they do with AI soon may still be too early. Things should get decidedly more interesting when they can ferret out what's being promised with "Ferret". But that will undoubtedly take time (and research), which is why Apple needs to start shipping AI stuff now. The war is for talent more so than products at the moment. That's the real risk to Apple.

Apple (and Google) have a unique advantage here thanks to their smartphones. And while it's early days, the potential to leverage such devices to help train their AI models in the real world simply must scare the shit out of the Metas and OpenAIs of the world – which is why they're all pushing so hard into hardware, or rumored to be working on any number of physical things.

But even the early success of Meta's smart glasses collaboration with Ray-Ban can't possibly hope to reach the scale of a smartphone – not even the scale of Google's Pixel devices, which remain a relatively small player in the Android ecosystem, let alone the iPhone. I suspect AI pins and other things – even those made by Apple – aren't ever going to get there either. So yeah, this is a pretty big deal for Apple and I suspect we'll hear a lot more about it in the coming years as we morph from LLMs to training AI in and with the real world around us.

As I followed up with regarding the latest Apple AI rumors in May:

This is the other key of the above equation. It's not just Apple's scale – Meta has arguably a more impressive scale when it comes to user reach – it's that they have truly integrated scale. Meaning, Apple has the unique capabilities to roll out features and functionality on the hardware and software side to over a billion devices that they fully control. This perhaps will matter more for AI than it has for anything else to date.

Not stated in this Gurman report: whether some of the AI features will be limited to certain devices based on hardware capabilities. But previous reports have indicated as much, which would bring the scale down from billions of devices to "merely" millions. Then again, if this is all rolling out as a sort of beta test for Apple, that would seemingly make sense.

And remember too that the other major argument for launching at least some major AI capabilities now is that Apple can use their real-world device scale to help train their AI in ways that their rivals will struggle to match. Again, thanks to their control of the most advanced hardware with its myriad sensors.

This is all starting to play out exactly as envisioned...

One more thing: wouldn't it have been nice for Apple to include Touch ID in the Camera Control button? I'm sure there are technical reasons why they couldn't, but it looks similar enough to the Touch ID button Apple built into the iPad mini...

🤖
More on this real-world AI push...
Lo! The AiPhone Nears...
Apple is quietly increasing its capabilities in artificial intelligence...
AI Agents Assemble!
OpenAI aims to usher out the Chatbot Era…
Apple’s Rightfully Boring AI Play
Apple is playing catch up, but doesn’t need to sprint
Apple’s AI-Driven ‘FacePod’ Thing
A perhaps risky project could be Apple’s first real robotics step?