Apple Set to Show Some AI Skin, But Not the Full Monty

Interesting timing on this news, given that it hit the wire almost exactly one hour before Google I/O is set to kick off. OpenAI and Microsoft already played their hands, so now we have the trifecta:
Apple Inc. is preparing to allow third-party developers to write software using its artificial intelligence models, aiming to spur the creation of new applications and make its devices more enticing.
The iPhone maker is working on a software development kit and related frameworks that will let outsiders build AI features based on the large language models that the company uses for Apple Intelligence, according to people with knowledge of the matter. Apple expects to unveil the plan on June 9 at its Worldwide Developers Conference, said the people, who asked not to be identified because the initiative hasn’t been announced.
To state the obvious potential problem here: does Apple have any models that developers wish to use? If this sounds harsh to you, perhaps you haven't been following the complete and utter shitshow that has been Apple's AI "rollout" over the past year. So either, post-shake-up, Apple is suddenly feeling a lot more confident in their capabilities, or...
Apple Intelligence already powers iOS and macOS features such as notification summaries, text editing and basic image creation. The new approach would let developers integrate the underlying technology into specific features or across their full apps. To start, Apple will open up its smaller models that run on its devices, rather than the more powerful cloud-based AI models that require servers.
That's seemingly the key here. This will start with the smaller, on-device models. And that's smart. Those models could be compelling to developers because they'll be able to run locally on the iPhone (or iPad or Mac) and thus much faster than any model in the cloud. But they'll also undoubtedly be far more limited than any larger "flagship" LLM. And that's presumably where Apple is going to be the weakest. Not just because they've been slow to embrace the AI revolution – other late entrants, such as Elon Musk's xAI, have been able to throw money at the problem to catch up quickly, and Apple has far more money to do so – but because their long-standing policies on things like privacy put them at a natural data disadvantage. Yes, it's a double-edged sword.
With that in mind, presumably they'll go back to talking up "Private Cloud Compute" at WWDC and noting that this too will be opened up for developers to use Apple's larger models eventually. And this on-device-first approach may buy Apple more time to get their cloud models up to speed, assuming they've truly found religion with regard to AI and how best to build around it.
Still, the question remains: will developers want to use Apple's models versus continuing to use the leading models from OpenAI or Anthropic or Google or others? If they opened it up tomorrow, presumably not. But we'll see how it plays out over the next year, I guess.
Apple is hoping to replicate the early success of the App Store, which occurred after the company opened up its in-house technologies and software frameworks to developers for the first time. By offering its models and making them simple to integrate, Apple is poised to turn its operating systems into the largest software platforms for AI.
It's a fine idea, and good framing on paper – presumably we'll get a name like 'AI Kit' or 'Core AI' – but it doesn't really reflect reality. Opening up development of native apps on the iPhone feels like a pretty different thing at this particular moment in time, for the reasons mentioned above. Apple will have to go a pretty long way to earn the kind of trust that would make this a similar moment.
One more thing: what if Apple opted to let other companies' models run on their servers (which are often rented on other providers' clouds too, FWIW) and let developers access them all through Private Cloud Compute? Now that could be interesting.