Salvaging Siri: Apple Intelligence Comes to the Rescue

The wait is over (sort of). Apple Intelligence is now live with this week’s iOS 18.1 general availability. This will mean many things, as Apple Intelligence is more of an embedded set of system functions than it is a feature or app. But the most important thing it could do is to rescue Siri from its morass of mediocrity.

Backing up for context, we published an article last year titled Is AppleGPT the Death Knell for Siri? It was based on rumors around “Apple GPT,” which ended up being called Apple Intelligence. Our premise was that there was no way the famously dysfunctional Siri would survive a face-off with a GPT-based challenger.

But the question at the time was whether Apple would kill Siri or give it a brain transplant. The former would mean walking away from some brand equity (but mostly brand baggage). More importantly, killing Siri would be cumbersome, given how deeply it is embedded in iOS, macOS, tvOS, and the rest of Apple's platforms.

After Apple Intelligence was unveiled in June, we got our answer: Siri will not only get a stay of execution but a brain transplant, courtesy of Apple Intelligence. This is in keeping with Apple Intelligence’s broader approach noted above, which is to be integrated and layered into products across Apple’s ecosystem.

This is good news for billions of iThings users who’ve endured years of frustrating moments with Siri. We’ll still have to wait for all of Apple Intelligence’s functionality to be integrated (the reason we said “sort of” earlier). But a decent amount of new functionality has already been added in iOS 18.1.


Mark the Occasion

So to mark the occasion, what are the ways that Siri is improving with Apple Intelligence? We’ve been reading up on several of its new capabilities and have synthesized a few of them here to save you time. Most of these are available now, and some will be released over time as Apple Intelligence continues to be infused across the ecosystem.

First, in a general sense, Siri will tap directly into ChatGPT to answer many questions – a function of Apple’s partnership with OpenAI. Though this makes it more of an intermediary, it will be an upgrade from relaying most questions to web searches, and will meet a standard that users have come to expect.

This will include increasingly demanded functions such as pairing questions with other media like photos, documents, and PDFs. Similar to Google’s work in multimodal search, use cases include taking a picture of a new restaurant in your neighborhood and asking Siri, “Does this place take reservations?”

As a side note, multimodal AI continues to gain momentum and is developing user habits and expectations. For example, multimodal AI in Ray-Ban Meta smart glasses lets users look at an object (visual), ask what it is (audible), and get an answer back (audible) from Meta AI.
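For a rough sense of what that kind of multimodal exchange looks like under the hood, here is a minimal Swift sketch that sends a photo and a question to OpenAI’s chat completions API (the public interface to ChatGPT-class models). To be clear, this is an illustration under our own assumptions: Apple has not published how Siri’s ChatGPT hand-off works, and the model name and request shape below are simply the standard public API, not Apple’s integration.

```swift
import Foundation

// Hypothetical illustration: send a photo plus a question to OpenAI's chat
// completions endpoint. How Siri actually relays requests to ChatGPT is not
// documented by Apple; this only shows the general multimodal request pattern.
func askAboutPhoto(question: String, imageData: Data, apiKey: String) async throws -> String {
    // Encode the image as a base64 data URL, the format the API accepts for inline images.
    let dataURL = "data:image/jpeg;base64," + imageData.base64EncodedString()

    // One user message carrying both a text part and an image part (multimodal input).
    let message: [String: Any] = [
        "role": "user",
        "content": [
            ["type": "text", "text": question] as [String: Any],
            ["type": "image_url", "image_url": ["url": dataURL]] as [String: Any]
        ]
    ]
    let body: [String: Any] = [
        "model": "gpt-4o",   // assumed model; any vision-capable model would do
        "messages": [message]
    ]

    var request = URLRequest(url: URL(string: "https://api.openai.com/v1/chat/completions")!)
    request.httpMethod = "POST"
    request.setValue("Bearer \(apiKey)", forHTTPHeaderField: "Authorization")
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try JSONSerialization.data(withJSONObject: body)

    let (data, _) = try await URLSession.shared.data(for: request)

    // Pull the assistant's reply text out of the JSON response.
    guard
        let json = try JSONSerialization.jsonObject(with: data) as? [String: Any],
        let choices = json["choices"] as? [[String: Any]],
        let reply = choices.first?["message"] as? [String: Any],
        let answer = reply["content"] as? String
    else { throw URLError(.cannotParseResponse) }

    return answer
}

// Usage (hypothetical): a photo of the restaurant's storefront, plus the question.
// let answer = try await askAboutPhoto(
//     question: "Does this place take reservations?",
//     imageData: photoJPEGData,
//     apiKey: "YOUR_OPENAI_API_KEY")
```

The point is simply that a single request can carry both an image and a question, which is the interaction pattern Siri is now adopting.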

Apple Intelligence will take things a step further for more personal context. Deviating from Siri’s previously generic approach, it will customize answers to your own data (which Apple has). For example, you can ask Siri to find the dinner invitation and location that your friend Matt emailed you last week.


Step Aside

Going further under the hood, Apple Intelligence will give Siri better natural language understanding. Here again, the functionality flows from ChatGPT. In fact, OpenAI’s Spring Update event spent ample time demonstrating how GPT-4o is getting impressively good at conversational interactions.

This includes handling questions about Apple’s own products, which Siri was, paradoxically, particularly bad at. For example, you can ask Siri to help you find your Wi-Fi password on your phone, or where the flashlight shortcut is in the iOS Control Center. These are questions that come up often.

Similarly, Siri will now be able to answer questions about what’s on your screen at any given time, as well as perform tasks across apps. Returning to the dinner invitation example above, you can tell Siri to find the email, text the address to another friend, then open the address in Apple Maps and save the directions for later.
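Apple hasn’t detailed how Siri chains these steps internally, but the public mechanism third-party apps use to expose actions to Siri and Apple Intelligence is the App Intents framework. As a hypothetical sketch (the names and logic here are ours, not Apple’s), a messaging app might declare an intent like this so that “text the address to a friend” can be one link in such a chain:

```swift
import AppIntents

// Hypothetical intent a messaging app could expose so Siri / Apple Intelligence
// can invoke "text this address to someone" as one step in a multi-app request.
struct SendAddressIntent: AppIntent {
    static var title: LocalizedStringResource = "Send Address to Contact"

    @Parameter(title: "Address")
    var address: String

    @Parameter(title: "Recipient")
    var recipient: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // The app's own send logic would go here (illustrative placeholder):
        // try await MessageService.shared.send(address, to: recipient)
        return .result(dialog: "Address sent.")
    }
}
```

Once an app ships intents like this, the system can surface them in Shortcuts, and the idea behind Apple Intelligence is that Siri can string them together with actions exposed by other apps.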

Bottom line: Apple Intelligence pulls personal assistant functions away from Siri and puts them in the hands of a capable AI engine. In some ways, that demotes Siri to a front-end interface rather than a functional back end. But that’s where it can shine, as it steps aside to let a professional take over.

As this occurs, Siri can begin the process of shedding its bad reputation. Though it was always great at setting timers and reciting the weather, getting answers to more nuanced questions often turned into a frustrating exchange with a sorry voice that couldn’t help you. Starting this week, those days are evidently over.
