Exactly one year ago today, we published an article entitled Is AppleGPT the Death Knell for Siri? It was based on rumors around “Apple GPT,” which ended up being called Apple Intelligence. Our premise was that there’s no way the famously dysfunctional Siri survives a face-off with a GPT-based challenger.
But the question at the time was whether Apple would kill Siri or give it a brain transplant. The former would mean walking away from some brand equity (and brand baggage). More importantly, killing Siri would mean the cumbersome work of ripping it out of its integrations across iOS, macOS, tvOS, and Apple’s other operating systems.
After Apple Intelligence launched last month, we appear to have an answer. Siri will avoid being taken to the woodshed and will instead get the brain transplant. This is in keeping with Apple Intelligence’s broader approach, which is to be integrated and layered into products across Apple’s ecosystem.
This is good news for billions of iThings users who’ve endured years of frustrating moments with Siri. We’ll still have to wait until fall before the brain transplant is complete, but it will be worth the wait. So to anticipate that day, what are some of the specific ways that Apple Intelligence will uplevel Siri?
Raising the Bar
Before answering that question, let’s address the “dysfunctional” comment above. Siri may be great at setting timers and telling you the weather, but getting answers to more nuanced questions turns into a frustrating exchange with a sorry voice that can’t help you. Meanwhile, ChatGPT has raised the bar.
So how will Apple Intelligence elevate Siri to a level of functionality deserving of its default positioning on billions of devices? For one thing, it will tap directly into ChatGPT to answer many questions. Though this makes it more of an intermediary, it will be an upgrade from relaying most questions to web searches.
This will include useful functions and formats such as pairing questions with other media like photos, documents, and PDFs. Similar to Google’s work in multimodal search, use cases include taking a picture of a new restaurant in your neighborhood and asking Siri, “Does this place take reservations?”
Similarly, Apple Intelligence will give Siri a bit more personal context, meaning that answers will be customized versus generic (its previous approach). Examples include asking Siri to quickly find the dinner invitation and location that was emailed to you last month from your friend Bill.
Good Conversationalist
Going further under the hood, Apple Intelligence will give Siri better natural language understanding. Here again, the functionality flows from ChatGPT. In fact, OpenAI’s Spring Update event spent ample time demonstrating how GPT-4o is getting impressively good at conversational interactions.
Similarly, Siri will better handle questions about Apple’s own products, something it was previously quite bad at. One memorable Siri fail involved asking where to find Control Center in the latest version of iOS (it had moved). Siri responded with web results for “The Control Center,” a treatment plant in San Jose.
Lastly, Siri will now be able to answer questions about what’s on your screen at any given time, as well as perform tasks across apps. For example, in the dinner invitation example above, you can tell Siri to find the email, text the address to another friend, then open it in Apple Maps and save the directions for later.
Bottom line: Apple Intelligence pulls personal assistant functions away from Siri and puts them in the hands of a real professional (ChatGPT). In some ways that relegates Siri to a front-end interface. But that’s where it can shine and endure less verbal abuse from users as it delivers the right answers.
