Apple Secures Their AI Security Angle

Using the 'Secure Enclave' to do AI in the cloud makes sense...
Apple’s Plan to Protect Privacy With AI
Putting Cloud Data in a Black Box...

Well, this appears to be the answer for how Apple will be able to tout security while still using the cloud for some of their forthcoming AI work:

Apple plans to process data from AI applications in a virtual black box, making it impossible for its employees to access it, according to four former Apple employees who worked on the project. Over the past three years, the company has been working on a secret project, known internally as Apple Chips in Data Centers or ACDC, that would allow for such black box processing. Its approach is similar in concept to confidential computing, an industry term that means the data is kept private even while it’s being processed.

As I wrote just a few days ago:

Presumably, Apple will be touting on-device AI as perhaps the key to their efforts, only made possible by Apple Silicon, naturally. That will undoubtedly lead to talk about how the largest AI workloads still need to be done in the cloud, and so Apple has built their own, secure data centers again running Apple Silicon to do this work. Privacy. Privacy. Privacy.

Yep. And last week, in envisioning a theoretical Apple AI feature being announced by Craig Federighi:

"Remember Time Machine? Many of you still use it to back-up your most important information and memories. Well now we're bringing it back to the future, powered by Apple AI. Time Machine is no longer just for backing-up, it's now for real time recall of anything you've done on your Mac. All fully run on your device where it's stored safely and completely encrypted.

And that's not all. We're bringing it to your iPhone and iPad too. So you can instantly find anything you've done on any device with a simple semantic search. All still stored locally and encrypted.

But if you need a way to search across your devices, we've come up with a secure way to do that in iCloud as well..."

If the above report proves to be accurate, I pretty much nailed it (we'll see about the new Time Machine feature and Federighi's dad joke). But really, it was easy to nail, because of course Apple needed a way to do at least some of their AI work in the cloud while still being able to play up the privacy angle, which will be one of their key vectors of attack against their rivals here. The other related angle? Apple Silicon – back to Wayne Ma:

Apple’s confidential computing techniques utilize the high-end custom chips it originally designed for Macs, which offer better security than competing chips made by Intel and AMD, the people said. By using its own chips, Apple controls both the hardware and software on its servers, giving it the unique advantage of being able to design more secure systems over its competitors. In doing so, Apple could make the claim that processing personal data on its servers is just as private as on the iPhone, they said.

This is also in line with the reports of Apple using their own M2 chips to power any AI work in their data centers. And if it works, it could speak to some larger ambitions Apple has in the cloud:

Such a move could open the floodgates to new AI features for iPhones powered by large language models in the cloud. Two of the former Apple employees who worked on the project said there are greater ambitions to offload the processing power of future Apple wearable devices to servers, allowing Apple to design more lightweight headsets and glasses without the need to accommodate large batteries and cooling fans for power-hungry processors.

There will be an interesting balancing act in touting the AI work that can be done on-device (or on "the edge," to use Microsoft's preferred parlance) versus on-server. Speed will certainly favor on-device, while overall compute (and related elements, such as battery life) will favor on-server. Security would have been a big point in favor of on-device as well (and again, something Microsoft has been touting with their Copilot+ PCs), but Apple's solution may mute that advantage, if not totally negate it. And if that works, again, it could help Apple shift more workloads to the cloud.

Anyway, a bit more on how Apple would clearly message this "secure cloud":

Key to the effort is an Apple-developed hardware technology known as the Secure Enclave, which was first released in 2013 to store the biometric data captured by the fingerprint sensor in the iPhone 5S. The Secure Enclave is an area of the chip that is physically separate from its main processor and acts like a storage locker for sensitive data such as passwords and cryptographic keys. Its design guarantees that even if hackers were to compromise an iPhone’s software and central processor, they still wouldn’t be able to access the data in the Secure Enclave.

Apple plans to use the Secure Enclave to help isolate the data being processed on its servers so that it can’t be seen by the wider system or Apple, these people say.
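The report doesn't detail the server architecture, but the core idea behind the Secure Enclave is key isolation behind a narrow API boundary: key material is generated inside the enclave and can never be read out, while outside code may only request operations on data it hands in. A minimal sketch of that model (the `ToyEnclave` class is hypothetical, illustrating the pattern rather than any Apple API):

```python
import hashlib
import hmac
import secrets


class ToyEnclave:
    """Toy model of Secure Enclave-style key isolation.

    The secret key is generated inside this object and is never
    exposed; callers can only ask the enclave to perform operations
    (here, computing and verifying MAC tags) on data they pass in.
    """

    def __init__(self):
        # Key material lives only inside the enclave boundary.
        self._key = secrets.token_bytes(32)

    def seal(self, data: bytes) -> bytes:
        # Return an authentication tag; the key itself never leaves.
        return hmac.new(self._key, data, hashlib.sha256).digest()

    def verify(self, data: bytes, tag: bytes) -> bool:
        # Constant-time comparison, performed inside the boundary.
        return hmac.compare_digest(self.seal(data), tag)


enclave = ToyEnclave()
tag = enclave.seal(b"user photo metadata")
assert enclave.verify(b"user photo metadata", tag)
assert not enclave.verify(b"tampered data", tag)
```

The point of the pattern is that even code with full access to the `ToyEnclave` object's public surface can use the key but never extract it; in real hardware, that boundary is physical silicon rather than a Python class, which is what lets Apple claim that a compromised host still can't read the protected data.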

Users have grown to trust the "Secure Enclave" on their Apple devices, so extending that branding to cloud security makes sense, even if the server-side implementation is a bit different from the on-device hardware.

More on Apple's AI ambitions...
Ready or Not, Here Apple AI Comes
Will slow but not fully steady win the race?
Apple’s Rightfully Boring AI Play
Apple is playing catch up, but doesn’t need to sprint