Apple is changing the way people use their iPhones, thanks in part to the release of iOS 18.1, which rolls out many of the hyped Apple Intelligence features the company first introduced at WWDC 2024. While this update brings helpful tools like Clean Up, which magically removes objects from your photos, and Writing Tools, which can change up your writing style, Visual Intelligence is unfortunately missing.
However, the iOS 18.2 developer beta finally unlocks this much-anticipated Apple Intelligence feature, which rumors suggest will officially launch sometime in December. If you have the iOS 18.2 developer beta, you can use Visual Intelligence to do a host of things with the iPhone 16's new Camera Control button, such as having ChatGPT describe what you're looking at, looking up prices on items you want to buy, and even finding a restaurant's opening hours.
Think of it much like Google Lens and Circle to Search, except it's Apple's interpretation and is only available on iPhone 16 models running the iOS 18.2 developer beta. If you have one, here's exactly how to use Visual Intelligence.
1. Use Camera Control to launch Visual Intelligence
Launching Visual Intelligence is a breeze. You just need to long press the Camera Control button, which will then play the new Apple Intelligence animation that outlines the entire screen in a rainbow glow to show it's running. Just be aware that Visual Intelligence is meant to run in portrait orientation, as it doesn't support landscape just yet.
2. Point and shoot what you want to search
Once Visual Intelligence is running, you only need to point your iPhone at what you want to search and tap the on-screen shutter button. From there, Visual Intelligence will inspect the photo and deliver an appropriate result.
The results will vary based on what you're capturing. For example, if it's the facade of a restaurant, Visual Intelligence will show you what time it closes. If it's a product or gadget, it will describe what it is. Plus, Visual Intelligence can pull out important details, like email addresses and phone numbers, which can then be copied and pasted into another app.
3. Ask ChatGPT follow-up requests
Pressing the chat bubble icon in the lower left corner of the Visual Intelligence interface leans on ChatGPT for additional details about what you're searching for.
Not only will it describe what it’s seeing, but you can type follow-up requests that the AI chatbot will try to answer the best it can.
4. Use Google search instead
If you'd rather have Google perform your visual searches, you can tap the Search with Google button in the bottom right corner instead.
In addition to showing a list of photos similar to what you've captured, you can tap one of the images to open another window showing shopping prices for what you're searching for.
As a productivity tool, Visual Intelligence can also do things such as summarizing a passage of text you've captured, as well as translating words in the image if they're in another language. These are just some of the functions of Visual Intelligence, showing how helpful it can be when searching for stuff. It's also worth noting that you'll need an active data connection for Apple Intelligence to work.
Apart from that, it'll be interesting to see what Apple changes between now and when iOS 18.2 officially launches. Other Apple Intelligence features worth checking out in iOS 18.1 include recording phone calls, proofreading with Apple Intelligence, and even typing to Siri instead of using voice commands.