Apple's Rightfully Boring AI Play
Just two weeks away from the WWDC keynote, Mark Gurman would seem to have the high-level goods on what Apple will be talking about with regard to AI:
Apple is preparing to spend a good portion of its Worldwide Developers Conference laying out its AI-related features. At the heart of the new strategy is Project Greymatter — a set of AI tools that the company will integrate into core apps like Safari, Photos and Notes. The push also includes operating system features such as enhanced notifications.
The system will work as follows: Much of the processing for less computing-intensive AI features will run entirely on the device. But if a feature requires more horsepower, the work will be pushed to the cloud.
Apple is bringing the new AI features to iOS 18 and macOS 15 — and both operating systems will include software that determines whether a task should be handled on the device or via the cloud. Most of the on-device features will be supported by iPhone, iPad and Mac chips released in the last year or so. The cloud component, meanwhile, will be powered by M2 Ultra chips located in data centers, as I’ve previously reported.
The "Project Greymatter" name is new, but the rest of this is what has been expected from a mixture of previous reporting, open source papers on Apple's work in AI, and just plain old common sense (about Apple, at least). The hybrid on-device/cloud processing will all come down to how well it's implemented from a product perspective. Can Apple make such routing invisible to end users? And do they even want to, or should users be told when their data is being transmitted to the cloud, for privacy reasons? There's also the speed element: anything done on-device will obviously be much faster than work being round-tripped to the cloud.
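To make the routing idea concrete, here's a minimal sketch of what a "should this run on-device or in the cloud?" decision might look like. To be clear: all names, tasks, and thresholds here are my own illustrative assumptions, not Apple's actual APIs or logic, which remain unannounced.

```python
# Hypothetical sketch of hybrid AI routing, per the Gurman report's
# description. Task names and cost numbers are invented for illustration.

ON_DEVICE_BUDGET = 4.0  # assumed compute-cost threshold, arbitrary units

TASK_COSTS = {
    "notification_summary": 1.0,    # light: summarize a few strings
    "photo_retouch": 2.5,           # medium: feasible on recent Apple Silicon
    "long_document_summary": 8.0,   # heavy: pushed to cloud servers
}

def route(task: str) -> str:
    """Return 'on-device' for cheap tasks, 'cloud' for heavy ones."""
    cost = TASK_COSTS.get(task)
    if cost is None:
        raise ValueError(f"unknown task: {task}")
    return "on-device" if cost <= ON_DEVICE_BUDGET else "cloud"
```

The product question is whether a heuristic like this can be invisible in practice: the user just asks for a summary, and the system quietly decides where the work happens.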
Presumably, Apple will be touting on-device AI as perhaps the key to their efforts, only made possible by Apple Silicon, naturally. That will undoubtedly lead to talk about how the largest AI workloads still need to be done in the cloud, and so Apple has built their own, secure data centers again running Apple Silicon to do this work. Privacy. Privacy. Privacy.
But Apple also has to recognize the current state of play and that means consumer demand for AI chatbots. We could in fact already be nearing the end of that craze as these bots evolve into agents and text input goes fully multi-modal. And so partnering with a player like OpenAI (and perhaps others) makes a lot of sense to "off-load" this element of AI. The key once again will be messaging this from a privacy and brand perspective.
There are several new capabilities in the works for this year, including ones that transcribe voice memos, retouch photos with AI, and make searches faster and more reliable in the Spotlight feature. They also will improve Safari web search and automatically suggest replies to emails and text messages.
The Siri personal assistant will get an upgrade as well, with more natural-sounding interactions based on Apple’s own large language models — a core technology behind generative AI. There’s also a more advanced Siri coming to the Apple Watch for on-the-go tasks. Developer tools, including Xcode, are getting AI enhancements too.
This is all fairly table-stakes AI stuff now. Again, the key here will be how Apple is able to productize all of the above to make it actually useful (and work well, hopefully). But this makes sense: start small and "boring".
One standout feature will bring generative AI to emojis. The company is developing software that can create custom emojis on the fly, based on what users are texting. That means you’ll suddenly have an all-new emoji for any occasion, beyond the catalog of options that Apple currently offers on the iPhone and other devices.
Also pretty standard for Apple to bake some element of "fun" into whatever they roll out. The problem with calling them "generative emojis" or whatnot is that they presumably won't work with the actual standardized emoji set, which is a large cross-company, cross-device effort, of course. So they'll undoubtedly be more like Apple's current "fun" expressive emojis that can track your facial expressions. But if done well, this could be a new iPhone differentiator – i.e. if AI emojis only work within iMessage, will they entice users in a similar manner to blue vs. green bubbles? Or will Apple "downgrade" them to simple images to send to others who are not using iOS devices? Will third-party chat apps, such as WhatsApp, get access?
A big part of the effort is creating smart recaps. The technology will be able to provide users with summaries of their missed notifications and individual text messages, as well as of web pages, news articles, documents, notes and other forms of media.
Again, this all just boils down to how well this is actually done and implemented. Will they borrow Arc's great pinch-to-summarize gesture?
Now, many of these features will be purely catch-up. There’s no leapfrogging here. Google has had many of the same AI features in its Pixel devices for several years. Samsung Electronics Co. rightfully threw in the towel on developing its own marquee AI features this year and relies on Google Gemini instead.
Sure, but this raises the question of whether Apple is actually behind here. Certainly Wall Street thinks so, which is reflected in the stock price relative to their peers. And while Apple and many Apple watchers will be quick to note that this doesn't matter to the company, it certainly does, because as one of the most widely held (maybe the most widely held?) public stocks in the world, this is a way Apple is now judged, for better or (mostly) worse.
Anyway, I think you can certainly make a case that even now, Apple isn't as far behind on AI as the headlines would have you believe. And that's simply because the state of AI continues to shift and change so rapidly. What was cutting edge a month ago is now old hat. And worse, companies are getting sidetracked chasing the new "hotness", at least in part to look like they're ahead in AI – but the actual usefulness and trustworthiness of much of this stuff is unclear.
It's all so chaotic right now. And so I think the longer-term picture could paint Apple in a good light if they can wait to implement AI that will actually prove to be useful. But again, the short-term picture could be rough from a Wall Street perspective. Which is ultimately why WWDC has to be "AI, AI, AI, AI, AI."
The big question is whether it really matters that Apple is playing catch-up here. The company has one advantage that few rivals can match: its massive base of users.
There will be hundreds of millions of Apple devices around the world that can support the AI features when they debut later this year. The owners of those devices will probably at least try out the new capabilities (the technology may be integrated tightly enough that people won’t even notice they’re using them). That could turn Apple into the biggest AI player overnight.
Right. This is the other key part of the above equation. It's not just Apple's scale – Meta arguably has more impressive scale when it comes to user reach – it's that Apple has truly integrated scale. Meaning, Apple has the unique ability to roll out features and functionality on both the hardware and software side to over a billion devices that they fully control. This perhaps will matter more for AI than it has for anything else to date.
Not stated in this Gurman report: whether some of the AI features will be limited to certain devices based on hardware capabilities. Previous reports have indicated as much, which would bring the scale down from billions of devices to "merely" millions. But again, if this is all rolling out as a sort of beta test for Apple, that would seemingly make sense.
And remember too that the other major argument for launching at least some major AI capabilities now is that Apple can use their real-world device scale to help train their AI in ways that their rivals will struggle to match. Again, thanks to their control of the most advanced hardware with its myriad sensors.
One more thing:
Another fun improvement (unrelated to AI) will be the revamped iPhone home screen. That will let users change the color of app icons and put them wherever they want. For instance, you can make all your social icons blue or finance-related ones green — and they won’t need to be placed in the standard grid that has existed since day one in 2007.
Again, there have been previous reports of changes to the grid, but the color element seems new (and would match what Android has been doing for some time). A new coat of iOS paint is long overdue, so this would all be a legit "finally" – I just wonder if iPadOS will get any of this on day one? Or if that OS will continue to be the under-loved and under-powered sibling.