The Voice Assistant Who Cried Wolf
My initial take after yesterday's WWDC keynote was that Apple was doing AI their way – right down to refusing to call it "AI," branding it "Apple Intelligence" instead. They took the technology, stripped out the hype, and distilled it into their products in ways that seemed both tangible and useful. Compared to other AI events, it was all rather cautious. But that's exactly what it needs to be in order to garner mass usage. While others dance to the promise of technology that gives good demos, Apple waltzes in and productizes technology that can be useful today.
This level of restraint and practicality engenders trust in their products. And Apple's focus on privacy and security just doubles down on that trust. And that has arguably never been more important than it is right now with the tentacles of AI reaching out to touch everything in computing. That's the playbook. Rinse. Repeat.
Except wait. Hold on a minute. Back to the sink.
Watching the keynote again today I couldn't help but hear this voice in my head. A voice that sounded a lot like Siri. A voice telling me something completely useless in response to a simple query. Such has been the state of Apple's AI – OG AI, sure, but still AI – for over a decade now. And despite promise after promise that with this next update, everything would be better, Siri has broken our trust. Over and over and over again.
So you'll forgive me if I pause at the notion that yes, this time it will be different. It's certainly possible that the LLM revolution will change the equation for Siri. And/or that ChatGPT will fully fill in the many Siri holes. But I watch these demos and I'm still just beyond skeptical that it will all work as well as it seems to. That's true of basically all demos, of course. But why are we trusting Apple this time?
And that's a problem because Siri remains at the heart of what Apple is selling with the new intelligent future. I don't care how nice her new coat of UI paint looks, if we ask her about our mom's flight and she can't figure it out, or spits back bogus information, well, we're back to where we are right now. With Larry David screaming obscenities at his phone.
I would love for all this to work as advertised. But I also would have loved for it to work as advertised in 2011. And 2012. And 2013. And 2014. And 2015. And 2016. And 2017. And 2018. And 2019. And 2020. And 2021. And 2022. And 2023.
Apple can tout Siri processing 1.5B queries a day 'til they're blue in the face; the question is how many of those responses leave users blue in the face. Given how mainstream the jokes are, and the truckloads of anecdotal data we all have, I would guess the number is not insignificant.
In a weird way, Apple only has themselves to blame. Siri is one of the few times where they were legitimately early (at least in a scaled way) to market with a new type of product. Steve Jobs had a vision of what the iPhone could be – and really, Apple itself had a vision of what the future of computing could be dating to the Jobs-in-exile years with Knowledge Navigator. They were simply too early.
It may not have seemed that way when Amazon's Alexa exploded onto the scene and seemed to blow Siri out of the water. But ultimately, that product was also too early. We're being told that now is the time, this time. And it may very well be, but we're going to need to see it. To hear it. Not in a demo, in our lives.
And the weird thing there is that we're not going to be able to – at least not fully – for quite some time. Apple is promising the full range of features touted yesterday not at some single later date, but instead over the course of the next year.
All that indicates to me is that Apple is once again early. While everyone was worried that the company would be late to the AI revolution, I fear that they're still too early. And that Siri, once again, won't be able to meet the moment. And fulfill the promise. I hope I'm wrong this time, but this is the one area where we can't really trust Apple, sadly. Because we can't take Siri's word for it. Literally.