I bought an iPhone 16 for its AI features, but I haven't used them even once – here's why


Jason Hiner/ZDNET

I am a user of tools. I work a great many hours each week. Anytime I find a tool that helps me meet my deadlines, do something I otherwise couldn’t do, or get a little more sleep or personal time, I add it to my kit bag, even if that means spending some money. For example, I’ve invested in an entire army of camera robots because they help me turn out better and more interesting videos.

When Apple Intelligence was announced, and its big features were tied to the iPhone 16 series, I decided it was time to upgrade my four-generation old iPhone 12 Pro Max. I bought my iPhone 16 Pro Max specifically for the productivity benefits I expected to get from Apple Intelligence.

Also: Why you should power off your phone once a week – according to the NSA

I haven’t touched any of the Apple Intelligence tools since they were released with iOS 18.1 and iOS 18.2, simply because I haven’t found a reason to. Jason Perlow was right. Of course, Jason is almost always right, which is one of the many reasons I’m glad I get to work with him here at ZDNET. 

In this article, I’m going to go down the list of Apple Intelligence features in iOS 18.1 and 18.2 and share with you why I’m not using them. I did test Apple’s AI features while preparing this article, just to confirm the details. But I haven’t found any actual productive use for any of them.

Let’s first discuss privacy

Before I go down the list of tools, let’s discuss privacy. Apple has made a fairly big fuss about privacy, specifically the way it uses its own private cloud to keep your AI requests private. While most Apple Intelligence AI processing is done on the phone, some aspects that require cloud capabilities are isolated specifically for privacy protection.

Also: iOS 18.2 was killing my iPhone’s battery until I turned off this feature

This is good, and if I were deeply concerned with my privacy, I would probably choose to use Apple’s tools over those of ChatGPT. But Apple does send queries to ChatGPT, and while they are anonymized, the content of those queries will go into ChatGPT’s information maw to be cataloged and processed in undisclosed ways.

I’ve lived my life in the public eye for a few decades now, so Apple’s privacy solution isn’t all that important to my personal needs. If it were, I probably would find Apple Intelligence’s implementation of features like writing tools to be a more valuable offering.

Writing tools

iOS 18.1 introduced the ability to proofread, summarize, and rewrite text on your phone. iOS 18.2, released in December 2024, added the ability to compose text based on a prompt.

These features have been table stakes in every chatbot released since 2022. There is nothing particularly “Apple” about the AI here, except that some of the processing is done on-device.

Personally, I don’t write on my phone. Way back in the PalmPilot days, I did. It was exciting to be able to take my little Palm device and a folding keyboard to a coffee shop and get work done. But laptops were big and heavy back then, and iPads didn’t exist. Now, if I expect to be out and need to do some writing, I take my MacBook Air or an iPad with a keyboard case.

Also: How iOS 18 turned my Apple Watch into the productivity tool of my sci-fi dreams

ZDNET does not allow articles to be created with AI help, so no matter what device I have, I wouldn’t use Apple Intelligence writing tools for my articles. The text I mostly produce while on the go, especially when I just have my phone, consists of Slack responses, text message responses, and email responses.

Those can all just be dictated using tools that have been in iOS for years. Apple Intelligence adds nothing in those contexts.

There is one writing tool that has proven to be a game changer for me, but it came out in iOS 18, before Apple Intelligence was officially released. That’s the ability for iOS to transcribe voice memos. I described how that feature transformed my Apple Watch into a powerful dictating and writing assistant. That feature was a win, but it’s not officially tied to Apple Intelligence.

  • Writing tools: Might help some folks, but they offer no real value to me.

Siri-ously?

OK, yes, Siri is now prettier. That new color band around the iPhone is kind of nice. But does it add any real productivity value? No.

In April, I showed how Logitech introduced a small, free add-on app that lets you send prompts to ChatGPT. Because it ties nicely to a mouse key, it’s a tiny convenience when working throughout the day. All it does is provide a dialog to accept a prompt, send the prompt to ChatGPT, and then display ChatGPT’s results in a window.

Also: How I set ChatGPT as Siri’s backup – and what else it can do on my iPhone

Apple Intelligence’s Siri “upgrade” barely provides as much functionality. If you turn on ChatGPT in Settings (you can also link your ChatGPT Plus account), when Siri can’t answer a question, it will ask if you’d like to send the question to ChatGPT. If you approve that action, it passes your query to ChatGPT, and you get back a text-based response in a little pop-up.

Ooh. Ahh.

The ChatGPT app on the iPhone is vastly better. You can have entire audio discussions with ChatGPT via the phone’s speaker and microphone. Using AI hands-free is a powerful way to benefit from AI, but with Siri’s ChatGPT integration, you just get a little pop-up with text.

Also: How to program your iPhone’s Action Button to summon ChatGPT’s voice assistant

Oh, and now you can thumb-type into Siri instead of just dictating. Ooh. Ahh.

I mean, what the heck, Apple? All this time, delaying features for three months after the phone launches, and the best you have is passing on a query through an API call? Freshman programming students were doing that back in 2023. Your devs could have learned how to do this in one free course, over a weekend.
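To illustrate just how thin that "pass the query along" layer is, here is a minimal sketch in Python. It assumes OpenAI's public chat completions endpoint; the function name, model string, and placeholder API key are all illustrative, not anything Apple actually ships.

```python
# Hypothetical sketch of forwarding a user's query to ChatGPT. The entire
# "integration" amounts to building a JSON payload and POSTing it to one URL.
API_URL = "https://api.openai.com/v1/chat/completions"

def build_chatgpt_request(query: str, api_key: str) -> tuple[dict, dict]:
    """Return the (headers, payload) for a single-turn ChatGPT query."""
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    payload = {
        "model": "gpt-4o-mini",  # illustrative model name
        "messages": [{"role": "user", "content": query}],
    }
    return headers, payload

# Actually sending it is a single POST, e.g. with the standard library:
#   import json, urllib.request
#   req = urllib.request.Request(API_URL, data=json.dumps(payload).encode(),
#                                headers=headers)
#   answer = json.load(urllib.request.urlopen(req))["choices"][0]["message"]["content"]

headers, payload = build_chatgpt_request("What's the tallest mountain?", "sk-...")
print(payload["messages"][0]["content"])
```

That's the whole trick: one request out, one text response back, displayed in a pop-up.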

  • Siri integration: It looks cool, but that’s it. No added value. Yawn.

Notification Summary and Reduce Interruptions Focus Mode

These two go hand-in-hand in my mind. Both are designed to help you manage the incoming flow of notifications, surfacing only the notifications you want to see and summarizing the details to make it easier for you to take action.

But… you’re relying on a provably fallible AI to manage your notifications. That’s fine if you want to reduce the number of spam notifications about sales from Harbor Freight Tools, for example. But it’s not fine if you want to make sure you get important work and family notifications.

Also: What’s really destroying your productivity – and 3 simple ways to focus better today

iOS has long had better tools for this. First, I’ve turned off notifications for every app I don’t really want to hear from. That approach doesn’t work for Instacart, though: while a shopper is out shopping, I want those notifications, but I don’t want the useless sale promos, especially when they promote foods I don’t eat.

In Instacart’s case, I just live with the extra notifications. But because I’ve turned off most notifications from most apps, the Instacart notifications aren’t overwhelming. My concern with using the AI would be that Apple Intelligence would deliver the notification that there’s a sale on Twinkies but not the notification that my shopper wants to know if he can replace fresh strawberries with a pint of strawberry ice cream.

Another tool I use is found in the Ringtone area of Contacts. If I want to make sure a family member can reach me, I turn on Emergency Bypass. That feature ensures that even if my phone is muted or in a do-not-disturb focus mode, my most important contacts can get through.

These approaches are far more reliable than hoping Apple Intelligence can figure out what messages I really want to get and which I want to ignore.

  • AI notification management: There are better, more reliable tools already available.

Priority email notifications and mail categories

Hey, Apple. Welcome to 2013. I use the Gmail app on my iPhone. None of the Apple Intelligence features targeted at email work in the Gmail app. That’s OK because new Apple Intelligence features like mail categorization and priority email notifications have been available in Gmail since roughly 2013.

Also: This easy email trick will make your inbox clutter vanish – automagically

I have been training my email inbox for years. When I get an email that’s not highly important, I move it out of the Primary category. Gmail now knows to send priority messages to Primary, all the promotions and press releases I receive to Promotions, all my family-related correspondence to Social, all status-related notices to Updates, and all tech support requests to Forums. Once in a while, Gmail puts the wrong thing in the wrong category, and a simple drag-and-drop trains it on what’s right.

Then, in my Gmail app, I have my email notifications set to high priority only. That way, the Gmail app only sends me notifications when I get a high-priority email.

  • AI email management: Not available for Gmail, where functionally equivalent features have been available for years.

Photo-related features

I find the Remove Background feature available from the Finder quite useful. But it’s been around since iOS 16 and is not part of Apple Intelligence.

Also: I use Photoshop’s AI tool every day – here are my 5 essential tips for the best results

Apple does offer a new Photos Clean Up feature, which is basically a generative fill feature. It removes the selected object and fills the now-open space with what would have been behind the object.

I much prefer to tweak my photos in Photoshop, but I can see this as a powerful tool for those traveling and simply wanting to upload a slightly cleaned-up image to a social media account. I’m not that user, so while helpful, I don’t use it.

Now, that brings us to the custom memory movie maker. I’m not the target user here. Perhaps if I had a baby and wanted to share a set of pictures with the grandparents, it might apply. But for me, it might as well not exist. I don’t like it when apps decide to blindside me with past memories when I’m not in the mood to reminisce.

Then there’s Apple Intelligence’s feature that allows you to describe a photo or video and get intelligent results back. Searching for “Pixel under a blanket” resulted in Photos identifying that Pixel is my pet, but none of the pictures returned showed him under a blanket. For the record, we have a lot of pictures of Pixel curled up in the blanket.


It did not find this picture, or the hundreds of other Pixel-under-a-blanket pictures we have taken. This one was taken with the iPhone 16 Pro Max.

David Gewirtz/ZDNET

On the other hand, searching for “3D printer” did return a lot of results with 3D printers in the images or videos. I do a lot of 3D printer-related content on my YouTube channel, and Apple Intelligence did find those images. It was not able to separate out images of a certain brand of printer, but at least it’s a start.

  • Photo features: A mixed bag. Some features are better in Photoshop, some I don’t use, and some are unreliable.

Image Playground, Image Wand, and Genmoji

These are three AI image generation tools, and they’re… cute.

Also: This hidden Apple feature turns your iPhone or iPad into an AI image generator

I tried my hand at Image Wand, which takes an image you draw in Notes and turns it into a more usable picture. The gotcha is that it requires a prompt as well as the image. I “drew” a picture of my dog Pixel, then used the wand tool to select it and entered the prompt “Pixel.” I didn’t tell the AI my picture was a dog, and you’d be forgiven if you couldn’t tell, either. Even so, it generated a dog picture that was loosely similar to my pup.


Screenshot by David Gewirtz/ZDNET

I tried again, simply drawing a circle and again prompting “Pixel.” This time, the AI generated a few different rings but no dog.

Image Playground takes the idea a step beyond, delving into DALL-E 3 and Midjourney territory, but in a far more limited way. It only allows for three graphic styles. I generally don’t expect to use this tool instead of Midjourney, which I pay for. That said, it’s possible that Apple’s images aren’t generated based on a publicly available dataset, so they may not be copyright violations waiting to happen. I don’t yet have any good details on how Apple is constructing its images, but this is an area we will be following.

And then there’s Genmoji, a tool that lets you create an emoji from an idea or prompt you type in. It might be fun, but I really can’t see myself using it for anything. I’m not a big emoji user, so again, I’m probably not the target market.

  • Image creation tools: Other tools are far richer and more useful. These are cute but serve no real practical purpose in my workflow.

Visual Intelligence

Look, as someone who spent years in product marketing, I’m as appreciative of a branding scheme as anyone. But Visual Intelligence, which is supposed to be a big feature of the iPhone 16 and iPhone 16 Pro, is a giant nothingburger.

So, here’s how it works. You can only access it using the Camera Control button on the iPhone 16 or 16 Pro. Why? Because some product manager figured people might buy an iPhone 16 to get it.

All it does — all it does — is let you snap a picture and then send it to ChatGPT for analysis or Google to search for similar images. That’s right, Apple Intelligence’s vaunted Visual Intelligence feature is a shortcut. That’s it.

You don’t need an iPhone 16 or 16 Pro to get features like this. Google Lens will run on any iPhone that runs iOS 15 or later. That’s an iPhone 6s or newer. An iPhone 6s! Google Lens does all that Visual Intelligence does, and more.

Also: 5 Google Lens tricks to level up your image search

And then there’s the ChatGPT app. All you need to do is open the ChatGPT app, hit the plus sign, take a picture, and then give ChatGPT a prompt. ChatGPT will do the rest. ChatGPT will run on an iPhone 8 or newer. Just make sure you’re running iOS 16.1 or later.

  • Visual Intelligence: Unnecessarily predatory marketing hype. C’mon, Apple. You’re better than this.

The final verdict

Some of you may find some of the Apple Intelligence features compelling. I know there are at least two or three of you out there who spent an entire weekend creating custom emoji. Don’t worry — your secret is safe with me.

But for me, none of these features is compelling. Worse, few are even useful. And Visual Intelligence is hype without substance, tied to selling a higher-end model of phone.

I do give a few points for the privacy argument, and I have found value in Voice Memos transcription, but otherwise, nothing here will benefit my mobile life whatsoever.

For a company founded on the principle of “insanely great,” I think Apple Intelligence is leaning way too heavily on that old reality distortion field, but Apple no longer has anyone with the chops to keep the field operating long enough to fool anyone.


You can follow my day-to-day project updates on social media. Be sure to subscribe to my weekly update newsletter, and follow me on Twitter/X at @DavidGewirtz, on Facebook at Facebook.com/DavidGewirtz, on Instagram at Instagram.com/DavidGewirtz, on Bluesky at @DavidGewirtz.com, and on YouTube at YouTube.com/DavidGewirtzTV.




