This useful Apple Intelligence camera feature is coming to iPhone 15 Pro – here's how it works


ZDNET

Apple’s Visual Intelligence feature is expanding beyond the iPhone 16. The iPhone 15 Pro will soon gain the capability, which digs up details on objects you capture through the camera.

In a post published on Wednesday, Daring Fireball’s John Gruber said that Apple representatives told him that iPhone 15 Pro (and presumably iPhone 15 Pro Max) owners will be able to use Visual Intelligence on their devices. 

The company wouldn’t reveal exactly when the feature would arrive beyond pointing to a “future software update.” But Gruber said he believes it’s destined for iOS 18.4, which should soon be available in beta and is slated to go live in early April.

Also: How to use Visual Intelligence on an iPhone 16 to identify unknown objects

Available on all current iPhone 16 models and the upcoming iPhone 16e, Visual Intelligence is an AI-based feature designed to help you identify and learn about animals, plants, landmarks, businesses, and a host of other items you view with your phone’s camera. Part of Apple Intelligence, the tool is quick and easy to use.

Just aim your phone at the item you want to investigate. After long-pressing the Camera Control on any iPhone 16 model, you can either run a Google search on the object or ask ChatGPT specific questions about it. In response, the AI presents you with the requested details.

There’s only one hiccup. The four existing iPhone 16 models use the physical Camera Control to trigger Visual Intelligence, and that button doesn’t exist on the iPhone 15 Pro – or the iPhone 16e, for that matter. No problem. Instead, users of the 15 Pro models and the 16e will be able to trigger Visual Intelligence with the Action button. Nestled above the volume controls, this button is customizable, so you can assign it to a variety of actions.

And there’s more, according to Gruber. Apple is also adding a Control Center button that will launch Visual Intelligence. That means you’ll be able to activate it just by swiping down on the screen and tapping the appropriate button. That option is headed for the iPhone 15 Pro models and presumably the iPhone 16 series.

Also: I bought an iPhone 16 for its AI features, but I haven’t used them even once – here’s why

I welcome the news that Apple is expanding Visual Intelligence. I often use the feature on my iPhone 16 Pro and find it quite helpful. Depending on the object I scan, sometimes the information I get is too generic. But I can ask ChatGPT more than one question to zero in on the details I want.

If Visual Intelligence support for the iPhone 15 Pro and the new Control Center option are both slated for iOS 18.4, then iOS beta users should have a chance to try out these features fairly soon.
