Forget the new Siri: Here's the advanced AI I use on my iPhone instead



The launch of ChatGPT sparked a generative AI craze and a tech revolution, forcing companies to innovate rapidly to stay competitive in a fast-changing landscape. 

Also: I replaced my iPhone 16 Pro with the 16e for a week – here’s everything I learned

Although Apple was late to the AI race, its launch of Apple Intelligence promised a transformative overhaul, putting Siri at the center of the Apple ecosystem as a context-aware personal assistant. However, Apple has now confirmed that this vision will take longer to materialize than expected.

When can you expect the AI-improved Siri?

On Friday, in a statement to Daring Fireball, an Apple spokesperson shared that the highly anticipated Siri upgrades, such as a more personalized Siri that is aware of your personal context and can perform actions for you, will take longer than expected to reach the public. The spokesperson added that the company anticipates rolling those features out in the coming year. According to the report, the statement read: 
“Siri helps our users find what they need and get things done quickly, and in just the past six months, we’ve made Siri more conversational, introduced new features like type to Siri and product knowledge, and added an integration with ChatGPT. We’ve also been working on a more personalized Siri, giving it more awareness of your personal context, as well as the ability to take action for you within and across your apps. It’s going to take us longer than we thought to deliver on these features and we anticipate rolling them out in the coming year.”

Also: Got a suspicious E-ZPass text? It’s a trap – how to spot the scam

This confirmation comes days after a report from Bloomberg correspondent and longtime Apple watcher Mark Gurman revealed that people within Apple's AI division believe the fully upgraded, conversational version of Siri won't reach consumers until iOS 20, which would place the release around 2027. 

When Apple originally showed off the concept at WWDC last June, it was marketed as a personal assistant that seamlessly integrates into a user's existing device ecosystem to provide meaningful behind-the-scenes help. Additionally, the upgrade would finally make Siri more conversational, enabling more natural, human-like exchanges, a highly requested improvement.

However, since then, the company has rolled out only a handful of Apple Intelligence features, most of which offer limited practical value. For example, users with eligible phones can now access Genmoji, Image Playground, notification summaries, writing tools, voicemail transcriptions, Visual Intelligence, and a ChatGPT integration. Ultimately, these features have fallen short, adding little to the everyday smartphone experience. 

Also: How to program your iPhone’s Action Button to summon ChatGPT’s voice assistant

Apple Intelligence also continues to trail behind competitors. Just last week, Amazon launched Alexa+, a conversational voice assistant with agentic capabilities that allow it to perform everyday tasks for you. It also uses your personal context and habits to provide better assistance and is coming to Alexa-enabled products already in people’s homes. 

Two workarounds

Before Amazon’s Alexa+ launch, Google and OpenAI each unveiled their own AI-powered conversational assistants: Gemini Live and Advanced Voice Mode, respectively. These assistants understand prompts in natural language, meaning you can speak to the AI as you would to a friend. They also support multi-turn conversations, so you can keep the conversation going as long as you’d like without losing prior context. 

Both voice assistants have settings that make them easy to access from an iPhone, allowing iOS users to forgo Siri for a more conversational, AI-enhanced experience. 

ChatGPT Advanced Voice

ChatGPT’s Advanced Voice Mode even has on-screen and camera awareness, making its assistance multimodal and adding an extra layer of support. As an iPhone user, you can easily access the assistant from the ChatGPT app, or, for even more seamless access, you can map it to your phone’s Action Button to summon ChatGPT.

All users can access Advanced Voice, but usage limits vary depending on your plan. OpenAI doesn’t specify the limits, though it does note that paid subscribers get more access. I use ChatGPT Plus, a plan that costs $20 per month, and I’ve never hit the limit. I have found plenty of everyday use cases for the assistant and reach for it constantly. 

Gemini 

If you prefer Google's Gemini conversational assistant, you can download the free Gemini app for your iPhone and start chatting. Activating the voice assistant is as simple as signing into your Google account and tapping the waveform icon in the bottom right-hand corner. It doesn't have screen awareness yet; however, that feature will roll out later this month to paid Gemini Advanced users.

Also: Apple Intelligence’s true potential on iPad and Mac lies in third-party apps

As spotted by 9to5Google, the Gemini iPhone app was also just updated to make it even easier to use. The new lock screen widgets include a “Talk Live” widget that activates the Gemini voice assistant with a quick tap from your lock screen. You can also add Gemini to Control Center, putting the assistant just a swipe away. 

Ready to leave Apple’s walled garden? Many Android phones, including the Google Pixel 9 and later and the Samsung Galaxy S25 lineup, have made Gemini the default voice assistant, letting users experience the AI-enhanced conversational assistant natively. 




