How to Use Visual Intelligence on iPhone

In Short
  • On iPhone 16 models, press and hold the Camera Control button to use Visual Intelligence.
  • On iPhone 16e, 15 Pro, and 15 Pro Max, you can access Visual Intelligence using the Action Button, the Control Center, or the Lock Screen Controls.
  • You can also use ChatGPT and Google Lens from within the Visual Intelligence interface on your iPhone.

Apple brought its answer to Google Lens to the iPhone with Visual Intelligence, except Apple’s version is powered by Apple Intelligence. Initially limited to devices with the physical Camera Control button, it’s now available on any iPhone capable of running Apple Intelligence. So, here’s how you can use Visual Intelligence on your iPhone if you have a device compatible with Apple Intelligence.

What is Visual Intelligence and How Does It Work?

Visual Intelligence is Apple’s advanced image recognition feature powered by Apple Intelligence, the company’s suite of on-device AI capabilities. With Visual Intelligence, your iPhone can understand the content of your camera view in real time.

It can help you identify objects, text, landmarks, animals, and plants. It can also recognize businesses and suggest actions based on what it sees. For example, if you point your camera at a restaurant’s name, Visual Intelligence might offer directions, reviews, or a link to make a reservation, all without needing to open another app. However, reservations are only available in the US, and even the rest of the feature may not work correctly outside the US; the business suggestions I got were often incorrect.

What sets Apple’s Visual Intelligence apart is that it processes everything on-device, ensuring your data remains private. You can also use ChatGPT or Google Lens with Visual Intelligence to get more information or find objects similar to the one you’re scanning. When you use ChatGPT or Google Lens, the information is shared with the selected third party.

How to Access Visual Intelligence on iPhone

Depending on your iPhone model, here’s how you can access Visual Intelligence.

Method 1: On iPhone 16 and 16 Pro Models

Apple’s latest iPhone 16 series (barring the iPhone 16e) comes with a Camera Control button. It lets you quickly open the Camera app and adjust controls inside. This button also doubles as a shortcut to launch the Visual Intelligence interface. Here’s how you can access it:

  1. Ensure that your iPhone is running iOS 18.2 or higher.
  2. Make sure you have Apple Intelligence turned on. If you haven’t enabled Apple Intelligence, see this guide to get started.
  3. Then, long-press the Camera Control button on the right edge of your iPhone to access Visual Intelligence.

Method 2: On iPhone 16e and iPhone 15 Pro Models

Visual Intelligence is available on iPhone 16e with iOS 18.3 and on iPhone 15 Pro & 15 Pro Max with iOS 18.4. So, make sure you’ve updated to the required software and enabled Apple Intelligence in the Settings app. Now, you can access Visual Intelligence on these devices using any of the following methods:

  • Action Button: Go to Settings > Action Button and select Visual Intelligence. You can then long-press the Action Button to launch Visual Intelligence.
  • Lock Screen: Add the Visual Intelligence control to one of the shortcuts at the bottom of the Lock Screen, then tap it to access Visual Intelligence.
  • Control Center: Open the Control Center and tap the Visual Intelligence control to access it.

How to Use Visual Intelligence on iPhone

You can use Visual Intelligence in numerous ways, from identifying plants and animals to saving the details of an event from a flyer.

  1. Open Visual Intelligence on your iPhone using any of the methods described above.
  2. Then, point the camera at the object. Depending on the object on the screen, Visual Intelligence will suggest an action.
  3. For instance, if you point the camera at an animal or plant, Visual Intelligence will automatically detect it and list its name toward the top.
  4. You can tap on this name to view even more information about the identified plant, animal, or object.
  5. If you point it at text, Visual Intelligence will suggest one of the following actions based on the text in the frame:
    • Summarize
    • Translate
    • Read Aloud
    • Call a phone number, open a location in Maps, start an email, go to a website, or create a calendar event for the date and time.
  6. Since Visual Intelligence is powered by AI, the actions it suggests will vary depending on the context.
  7. You can also use Visual Intelligence to get more information about any business in front of you, such as opening and closing times and contact information, and to view the menu or available services. It uses your location to suggest businesses in the vicinity. Depending on the business, say a restaurant, you can also view ratings and reviews, make reservations, or place orders by tapping the business name or the Camera icon.

Note:

For some interactions, such as analyzing text or getting more actions for a business, you’ll need to press the Camera Control button or tap the circle at the bottom of the screen so Visual Intelligence can analyze what’s in view and offer more options.
  1. Two options are present on the screen regardless of what type of content you’re interacting with: Ask and Search. If Visual Intelligence can’t identify something, these are the only two options you’ll get.
  2. To ask ChatGPT for more information, tap Ask at the bottom left, then type your request and send it to ChatGPT.
  3. ChatGPT will return a response to your request, and you can ask the chatbot follow-up questions.
  4. Alternatively, tap Search at the bottom right to find similar items using Google Lens.

That’s all you need to know about using Visual Intelligence on your iPhone. Apple’s interpretation of a vision-based AI is awfully convenient to access, even if it’s not perfect, allowing more people to get on board the AI train. And with the ChatGPT and Google Lens integrations, you can quickly get information that Visual Intelligence itself can’t provide.

Visual Intelligence isn’t available on my iPhone 15 Pro Max. Why?

To use Visual Intelligence on iPhone 15 Pro and 15 Pro Max, you need to update to iOS 18.4 and access the feature either from the Lock Screen Controls or the Control Center. With iOS 18.3, Apple only added support for Visual Intelligence on the iPhone 16e.

How do I enable Visual Intelligence on my iPhone?

To enable Visual Intelligence on your iPhone, make sure that you have enabled Apple Intelligence by going to Settings > Apple Intelligence & Siri.
