M.G. Siegler

OpenAI's Local AI Device Dreams

What if their would-be AI device doesn't require a connection...

While most of the talk coming out of Brad Gerstner's sit-down with Satya Nadella and Sam Altman was around a snap-back comment the latter made when Gerstner dared question how OpenAI might be able to pay for their trillion-dollar-plus build-out commitments (something Alex Kantrowitz and I discussed at length in his latest Big Technology Podcast), another comment, made almost in passing, caught my eyes and ears. As Jowi Morales relays for Tom's Hardware:

Aside from the lack of power, they also discussed the possibility of more advanced consumer hardware hitting the market. “Someday, we will make a[n] incredible consumer device that can run a GPT-5 or GPT-6-capable model completely locally at a low power draw — and this is like so hard to wrap my head around,” Altman said. Gerstner then commented, “That will be incredible, and that’s the type of thing that scares some of the people who are building, obviously, these large, centralized compute stacks.”

In trying to think through the type of device OpenAI might be building alongside Jony Ive's io team, which they acquired, I hadn't really thought about the notion that such a device might aim to run AI locally.

Certainly Altman's "someday" qualifier may mean that it's the ultimate goal and not the initial goal for their hardware. But the fact that he calls out GPT-5 and GPT-6 suggests that perhaps "someday" isn't that far away.

Imagine a device – a small, screen-less digital companion for your life – that is always on, always recording, always processing because it's running AI locally, not in the cloud. This would seemingly help with speed, privacy, and even battery life. Granted, you would still need the cloud/connection for certain things, such as up-to-date information retrieval, but perhaps the device could "fall back" to your smartphone connection for those use cases – especially since, without a screen on this device, OpenAI clearly envisions that you'll still have your smartphone with you. Perhaps they just hope their device allows you to use it less.
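To make that "fall back" idea a bit more concrete, here's a minimal sketch of what such a routing layer could look like. Everything in it is my own illustration – the function names, the crude "does this need fresh data?" heuristic, and the assumption that the device tethers to the phone for cloud calls – none of it is anything OpenAI has actually described.

```python
from dataclasses import dataclass

@dataclass
class Answer:
    text: str
    source: str  # "local" or "cloud"

def needs_fresh_data(prompt: str) -> bool:
    """Crude illustrative heuristic: queries that likely need up-to-date info go to the cloud."""
    fresh_keywords = ("today", "latest", "news", "score", "weather", "price")
    return any(word in prompt.lower() for word in fresh_keywords)

def run_local_model(prompt: str) -> str:
    """Placeholder for an on-device model (e.g. a small quantized LLM)."""
    return f"[local answer to: {prompt}]"

def run_cloud_model(prompt: str) -> str:
    """Placeholder for a cloud call made over the phone's tethered connection."""
    return f"[cloud answer to: {prompt}]"

def answer(prompt: str, phone_reachable: bool) -> Answer:
    # Default to the local model: faster, private, and it works offline.
    if needs_fresh_data(prompt) and phone_reachable:
        return Answer(run_cloud_model(prompt), source="cloud")
    return Answer(run_local_model(prompt), source="local")

if __name__ == "__main__":
    print(answer("What's the weather today?", phone_reachable=True))
    print(answer("Summarize my last conversation.", phone_reachable=False))
```

The point of the sketch is simply that the local path is the default and the phone connection is an opportunistic extra, which is the opposite of how today's cloud-first assistants are built.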

And this would seemingly answer a question I've had about the device from the get-go – because the connection element is something that trips up many consumer devices for different reasons. As I wrote in May:

Of course, none of that answers how such a device will connect to the cloud which will clearly be required here. A WiFi device seemingly doesn't make sense since it would be a huge pain to set up over and over again if you did take it with you (perhaps especially without a screen!). Perhaps it tethers to your phone, but there will be trade-offs there. Or maybe it comes with its own connection – the old, original Amazon Kindle approach. With the right partners, this could work, though again, who's paying for that bandwidth?

This may seem like a detail, but it will be of vital importance if the goal is truly to make a device that is "inevitable," as Ive so often likes to say.

In the video, they're quick to note that it won't replace the smartphone any time soon, but clearly the intent here is to get people to use them less often. To do that, they will have to fill the moments in time that a smartphone fills for billions right now. Waiting in line. A lull in a conversation. The desire to look something up, only to get distracted down a rabbit hole of notifications.

And a month ago, I followed up with:

If the team (rationally) assumes that everyone would still have their phone with them alongside this new device – and certainly not including a screen on this device will necessitate that – the easiest thing would be to tether to the phone, obviously. But there are also issues there when you don't fully control the phone – just ask Mark Zuckerberg for his thoughts on that topic...

A localized AI could solve a bunch of these issues...

Per Altman's comment, a model running locally on-device could end up being profound in the way it shifts user behavior and the relationship with the device. Again, this isn't meant as an AI service hoovering up information about you on their servers, but rather your own personal AI companion – certainly, you could see that being the way OpenAI would want to frame this product and use such a narrative to sell it. Quite literally!

Ironically, this is something I've been going back to in thinking about Apple's own positioning in the AI market. Because they control the hardware on which these services all run, if they too can start building devices that are equipped for regular people to run AI locally, that's interesting for the same reasons as OpenAI's device. It's nuanced and vague, but there are probably going to be new and interesting ways people learn to use AI when it operates in an always-on and extremely fast way with privacy as a key feature.

Granted, we're not there yet with the right combination of models that can run locally and this type of open-ended functionality – at least not without some major, and majorly expensive, hardware (and access to the right LLM code!). But it's undoubtedly just a matter of time. And you can probably bet that OpenAI, building hardware overseen by Jony Ive, wants to beat Apple to that particular punch.

👇
Previously, on Spyglass...
The Anti iPhone
Jony Ive’s antidote to the smartphone obsession he helped usher in…
OpenAI’s Digital Assistant Device
Obvious challenges aside, it’s starting to come into focus…
An AI Device with LoveFrom OpenAI
Or: ‘When Jony Met Sam’ -- Also: ‘A San Francisco LoveLetter’
“We’ve gone sideways.”
Laurene Powell Jobs on OpenAI/IO’s iPhone antidote…
OpenAI & IO
Sam Altman & Jony Ive aim to take the AI race into the real world…