Will Siri finally become useful? That’s the promise Apple laid out today at its “Glowtime” event, where the company introduced its iPhone 16 lineup — its first new iPhones to ship with AI-powered functionality, courtesy of Apple Intelligence and, later, a partnership with ChatGPT maker OpenAI.
While consumers won’t get the full impact of the Siri upgrade until Apple Intelligence launches, Apple promises it will upend the user experience by making the iPhone not just a small computer that fits in your pocket, but also a small personal assistant powered by AI.
In the near term, Siri will see more immediate improvements, including the ability to type questions to Siri instead of speaking them and to hold more natural conversations, with Siri able to follow along even if you stumble over your words, thanks to richer language understanding. You’ll also be able to change Siri’s wake word through a new accessibility feature.
Developers are gaining access to SiriKit, which allows them to integrate Apple Intelligence-powered features into their own apps, the way Apple has integrated Siri with first-party apps like Calendar, Mail, Notes, Safari, Files, Contacts, Voice Memos, Photos, Books, Freeform and others.
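For developers, that integration point is worth a quick sketch. Apple’s documented route for exposing in-app actions to Siri on recent OS releases is the App Intents framework (the modern successor to much of SiriKit), and a minimal, hypothetical intent might look like the Swift below; the intent name, parameters and behavior are illustrative assumptions, not code from Apple.

```swift
import AppIntents

// A minimal, hypothetical App Intent. Declaring a struct that conforms to
// AppIntent is how an app describes an action the system (and Siri) can invoke.
struct SendAlbumPhotosIntent: AppIntent {
    // The user-visible name of the action.
    static var title: LocalizedStringResource = "Send Album Photos"

    // Parameters the system can fill in when the action is invoked.
    @Parameter(title: "Album Name")
    var albumName: String

    @Parameter(title: "Recipient")
    var recipient: String

    // Called when Siri (or Shortcuts) runs the action.
    func perform() async throws -> some IntentResult {
        // A real app would fetch the album's photos here and hand them to its send flow.
        print("Sending photos from \(albumName) to \(recipient)")
        return .result()
    }
}
```

Intents declared this way are discovered at build time, so Siri and Shortcuts can surface them without an extra registration step.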
Even Apple’s AirPods are getting an upgraded Siri experience, as users will be able to nod or shake their heads in response to Siri announcements.
To showcase the new AI functionality, Siri is getting a cosmetic makeover in iOS 18. Instead of a glowing orb at the bottom of the screen, an eye-catching illumination will light up the edges of the display whenever Siri is active on iPhone, iPad or CarPlay.
Unlike the new iPhones, Apple Intelligence, and with it the best of Siri’s upgrade, will have a slower rollout. Apple says the first set of features will be available in beta next month, with more features arriving in the coming months. U.S. English will be supported initially, followed by localized English for Australia, Canada, New Zealand, South Africa and the U.K. Sometime next year, support will reach users who speak Chinese, French, Japanese and Spanish before expanding to other languages.
Other Apple Intelligence features like Writing Tools, Mail and Notifications summaries, and a Clean Up tool in Photos will also arrive in beta starting next month across iOS 18.1, iPadOS 18.1, and macOS Sequoia 15.1.
Apple Intelligence will superpower Siri
When Apple Intelligence arrives, Siri will be able to handle questions and commands that draw on its understanding of you and your needs. With a better grasp of your personal context, it will let you reference things like a song you streamed, an email you reviewed, a calendar appointment or a text, instead of only responding to simple commands like “call Mom.” That means you’ll be able to ask Siri about your meeting and what the weather will be like at the meeting’s location, or tell it to send an email you had drafted.
“With Siri’s personal context, understanding and action capabilities, you’ll be able to simply say, ‘Send Erica the photos from Saturday’s barbecue,’ and Siri will dig up the photos and send them right off. With new ways to express yourself and relive memories, along with tools to help you prioritize and focus so you can get more done, Apple Intelligence is going to transform so much of what you do with iPhone,” noted Apple SVP of Software Engineering Craig Federighi at today’s iPhone announcement. “Apple Intelligence will be available as a free software update,” he said.
Siri will also be able to offer tech support, thanks to an improved understanding of your Apple products, their features and functionality.
It will also be able to help you in other ways an assistant could, thanks to its on-screen awareness.
For instance, you’ll be able to tell Siri to add a friend’s address to their contact card after they text it to you, or ask it to edit a photo with a filter or drop that photo into another app. If a friend texts you about a new album, you can say “play that.”
You could also add a set of photos to an album using Siri, or send them to a friend. And you can have Siri summarize a transcript of a recording.
On iPhone 16 Pro models, Apple suggests Siri’s new capabilities will be great for photographers, who will be able to ask Siri to pull up a specific shot from their library and then apply an edit to the photo in an app like Darkroom, or even use Siri to quickly get suggestions on “how to enhance the space and achieve their vision,” the company said on Monday.
The bigger Siri upgrade, however, may not come directly from Apple but from OpenAI, through the partnership announced earlier this year at Apple’s Worldwide Developers Conference.
With the arrival of Apple Intelligence, Siri users will have the option to pose their “world knowledge” questions to ChatGPT, instead of being frustratingly directed to the web for the questions Siri can’t answer.
The partnership with OpenAI helps Apple get ahead in the AI race, where it’s perceived to have fallen behind, without taking on the responsibility or the reputational hit that comes when AI gets things wrong or hallucinates an answer.
In time, Apple is expected to announce more AI partners as well.
Source: TechCrunch