AI Is the Feature, Apple Makes the Products

But also: what if Apple *does* need a ChatGPT-like product?

John Gruber, reflecting on the 'Apple Intelligence' announcements, recalls the story of how Steve Jobs told the founders of Dropbox that their company was "a feature, not a product":

> Leading up to WWDC last week, I’d been thinking that this same description applies, in spades, to LLM generative AI. Fantastically useful, downright amazing at times, but features. Not products. Or at least not broadly universal products. Chatbots are products, of course. People pay for access to the best of them, or for extended use of them. But people pay for Dropbox too.
>
> Chatbots can be useful. There are people doing amazing work through them. But they’re akin to the terminal and command-line tools. Most people just don’t think like that.

Also leading up to WWDC, this was my general (obvious) guess as to how Apple would play this:

> The key, as always with Apple, will be how well they’re able to seamlessly bake the AI into their products. With many other services right now, AI feels tacked-on at best, and unnaturally shoved in a user’s face at worst. To me, one of the key reasons that OpenAI has been a leader in the field, beyond some first-mover advantages, is that they’re the best at actually productizing the AI innovations. As we’ve learned time and time again throughout the history of technology, it’s not good enough to have the best technology; you need to make it sing. You need to wrap a compelling product around the technology to get people to actually use it. To get people to actually *want* to use it.
>
> And so for something like notification summaries, it can’t just be a data dump of items an AI deems to be important across a range of apps; it needs to be an elegant solution to the problem of notification overload. A feature that both truly saves users time and is presented in a way that is desirable to read and use.
>
> Again, this is what Apple does. They look at some early takes on general ideas and figure out a way to make them work for the masses. And I suspect that AI will be no different.

Again, this was all right there before the announcement. Apple did what they always do. But it was easy to go down rabbit holes of wondering how Apple might try to outgun the competition in AI. Especially because AI has been an absolute arms race amongst Apple's competitors. But Apple's focus, instead, was on outsmarting the others – by doing things they can't possibly do. Back to Gruber:

> Instead Apple is doing what no one else can do: integrating generative AI into the frameworks in iOS and MacOS used by developers to create native apps. Apps built on the system APIs and frameworks will gain generative AI features for free, both in the sense that the features come automatically when the app is running on a device that meets the minimum specs to qualify for Apple Intelligence, and in the sense that Apple isn’t charging developers or users to utilize these features.

And:

> We had a lot of questions about Apple’s generative AI strategy heading into WWDC. Now that we have the answers, it all looks very obvious, and mostly straightforward. First, their models are almost entirely based on personal context, by way of an on-device semantic index. In broad strokes, this on-device semantic index can be thought of as a next-generation Spotlight. Apple is focusing on what it can do that no one else can on Apple devices, and not really even trying to compete against ChatGPT et al. for world-knowledge context. They’re focusing on unique differentiation, and eschewing commoditization.
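
To make the "next-generation Spotlight" analogy a bit more concrete, here is a minimal sketch of what a semantic index looks like in spirit – everything below is hypothetical and purely illustrative, not an Apple API. Personal items are stored alongside embedding vectors, and a query is answered by similarity search over those vectors, all on device.

```swift
// A toy, entirely hypothetical illustration of an "on-device semantic index" —
// not an Apple API. Personal items are stored alongside embedding vectors, and a
// query is answered by nearest-neighbor search over those vectors, all locally.

struct IndexedItem {
    let title: String
    let embedding: [Double]   // Would come from an on-device embedding model in practice.
}

// Cosine similarity between two equal-length vectors.
func cosineSimilarity(_ a: [Double], _ b: [Double]) -> Double {
    let dot = zip(a, b).reduce(0) { $0 + $1.0 * $1.1 }
    let magA = a.reduce(0) { $0 + $1 * $1 }.squareRoot()
    let magB = b.reduce(0) { $0 + $1 * $1 }.squareRoot()
    return (magA > 0 && magB > 0) ? dot / (magA * magB) : 0
}

struct SemanticIndex {
    private var items: [IndexedItem] = []

    mutating func add(_ item: IndexedItem) {
        items.append(item)
    }

    // Return the items most similar to the query embedding.
    func search(queryEmbedding: [Double], topK: Int = 3) -> [IndexedItem] {
        let ranked = items.sorted {
            cosineSimilarity($0.embedding, queryEmbedding) >
            cosineSimilarity($1.embedding, queryEmbedding)
        }
        return Array(ranked.prefix(topK))
    }
}

// Toy three-dimensional "embeddings" stand in for real model output.
var index = SemanticIndex()
index.add(IndexedItem(title: "Mom's flight itinerary (Mail)", embedding: [0.9, 0.1, 0.0]))
index.add(IndexedItem(title: "Podcast draft (Notes)", embedding: [0.1, 0.8, 0.1]))
index.add(IndexedItem(title: "Beach weekend photos (Photos)", embedding: [0.0, 0.2, 0.9]))

let hits = index.search(queryEmbedding: [0.85, 0.15, 0.05], topK: 1)
print(hits.map(\.title))   // ["Mom's flight itinerary (Mail)"]
```

Swap the toy vectors for real on-device embeddings of mail, messages, photos, and files, and you get something that behaves like Spotlight but retrieves by meaning rather than by keyword.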

The one thing I still wonder about post-WWDC is whether Apple has a go-forward plan in case the chatbot paradigm isn't just a fad. I'm using "chatbot" as a catch-all here – I also believe chatbots are a feature – but it's more about the overall interaction paradigm. That is, what if ChatGPT has taught a new generation that the best way to interact with AI is to ask "it" something (be it via text, voice, images, etc.) and get something back – not just from your content, but from beyond it? Apple doesn't currently have a way to do that – aside from the ChatGPT fallback. What if Apple still needs to answer this call?

Logically, this would be through Siri – assuming they can get Siri up to snuff, which is still a big assumption. And maybe we get to the point where "she" merges with Spotlight? And then Apple could just swap out ChatGPT/Gemini/Claude/etc. for their own LLM output. But the brilliance of their model here is that they can do this slowly and subtly over time. This doesn't have to be a rip-out-Google-Maps-and-shove-a-wonky-Apple-Maps-in-users'-faces situation. The query routing system Apple has built should allow them to just slot in their own results as they become confident in them.
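
To sketch what that could look like in the abstract – and this is only an illustration, with made-up names and logic, not anything Apple has shipped or documented – the core mechanism is a confidence-gated dispatch: prefer the on-device answer when it scores well, fall back to an external chatbot when it doesn't, and turn up the threshold as the in-house model earns trust.

```swift
import Foundation

// Purely hypothetical sketch of the routing idea described above — none of these
// types are real Apple APIs. The mechanism is just a confidence-gated dispatch.

enum AnswerSource {
    case onDevice(String)
    case externalChatbot(String)
}

struct OnDeviceModel {
    // Returns an answer plus a made-up confidence score in [0, 1].
    func answer(_ query: String) -> (text: String, confidence: Double) {
        // Stand-in logic: pretend personal-context queries score well.
        let looksPersonal = query.lowercased().contains("my")
        if looksPersonal {
            return (text: "Found it in your own mail, messages, and photos.", confidence: 0.9)
        }
        return (text: "Not confident about this one.", confidence: 0.3)
    }
}

struct ExternalChatbot {
    // Stand-in for a third-party, world-knowledge model (the ChatGPT-style fallback).
    func answer(_ query: String) -> String {
        "World-knowledge answer for: \(query)"
    }
}

struct QueryRouter {
    var onDevice = OnDeviceModel()
    var fallback = ExternalChatbot()
    var confidenceThreshold = 0.8   // The dial that can be turned over time.

    func route(_ query: String) -> AnswerSource {
        let local = onDevice.answer(query)
        if local.confidence >= confidenceThreshold {
            return .onDevice(local.text)
        }
        return .externalChatbot(fallback.answer(query))
    }
}

let router = QueryRouter()
print(router.route("When is my mom's flight landing?"))   // likely handled on device
print(router.route("Who won the 1986 World Cup?"))        // likely routed to the fallback
```

The dial is the point: as the in-house model improves, more queries clear the threshold and the external fallback quietly handles fewer of them, with no visible switchover for users.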

Or perhaps they do decide that this is just like the web search paradigm. In that case, they strike a deal with one of these players to be the default chatbot in exchange for a handsome fee and/or revenue split.

Right now, I could see it going either way. But again, Apple has given themselves maximum optionality with the Apple Intelligence system they've built.

One more thing – Gruber concludes:

> If generative AI weren’t seen as essential—both in terms of consumer marketing and investor confidence—I think much, if not most, of what Apple unveiled in “Apple Intelligence” wouldn’t even have been announced until next year’s WWDC, not last week’s WWDC. Again, none of the features in “Apple Intelligence” are even available in beta yet, and I think all or most of them will be available only under a “beta” label until next year.
>
> It’s good to see Apple hustling, though. I continue to believe it’s incorrect to see Apple as “behind”, overall, on generative AI. But clearly they are feeling tremendous competitive pressure on this front, which is good for them, and great for us.

Yes, agreed. This is all a bit too vaporware-y, with some features perhaps a year out. But Apple clearly felt the need to answer the AI call. The risk, of course, is that they're still too early, not too late.