Every now and then, we run into a situation where we need to extract text from an image. Whether it’s a quick online search, sharing some information, or digitizing details like phone numbers and email addresses, we usually rely on OCR apps such as Google Lens to get the job done. While a third-party OCR app works fine, a native alternative is always better. And that’s exactly what has made Live Text a fan favorite in iOS 15. If you wish to go hands-on with this new feature, read on to learn what Live Text is and how to use it in iOS 15 on iPhone and iPad.
Use Live Text in iOS 15 on iPhone and iPad (2021)
Here, we not only explain how the Live Text feature works in iOS 15 but also show you how to use it via multiple methods, including the Camera app, the Photos app, and more. So, let’s get down to it!
What is Live Text in iOS 15 and How Does It Work?
Live Text is a classic example of how smartly Apple takes inspiration from third-party offerings and comes up with something more efficient and more deeply integrated into its ecosystem. If Night Mode on the iPhone 11 series seemed fascinating despite arriving late to the party, AirPods didn’t take long to spark a revolution in the true wireless earbuds segment. The Cupertino giant’s knack for coming from behind and delivering a more compelling alternative to an existing app or feature is well known. And Live Text inherits all the traits Apple is known for.
Google Lens has long been a top-notch app for recognizing objects and animals, extracting text from images, and more. However, it isn’t as deeply integrated into the Android experience as Live Text is in iOS 15, iPadOS 15, and even macOS Monterey. Live Text works much like Google Lens and makes extracting text from images (or almost anything around you) a pretty intuitive affair. Just point your camera at any object to extract a phone number or email address, get directions in Apple Maps, and more.
You can use Live Text in the native Photos and Camera apps as well as inside other apps like Safari, Messages, WhatsApp, and more. Plus, you can also bring it up from Spotlight to quickly look up anything that catches your eye. To be more precise, you can trigger Live Text from any text input field on your iPhone and iPad. Thus, it wouldn’t be wrong to say that Live Text in iOS 15 is one of the best OCR (Optical Character Recognition) powered features out there.
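For the curious, the same kind of on-device OCR that powers Live Text is exposed to developers through Apple’s Vision framework. Below is a minimal sketch using `VNRecognizeTextRequest`; it illustrates the underlying text-recognition API, not Live Text’s actual (undocumented) internals.

```swift
import Vision
import CoreGraphics

// A minimal sketch of Live Text-style OCR using Apple's public Vision
// framework. This is an illustration of the underlying text-recognition
// API, not Apple's own Live Text implementation.
func recognizeText(in image: CGImage) -> [String] {
    let request = VNRecognizeTextRequest()
    request.recognitionLevel = .accurate      // slower, but more precise
    request.recognitionLanguages = ["en-US"]  // one of the languages Live Text supports

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    do {
        try handler.perform([request])
    } catch {
        return []
    }

    // Each observation carries ranked candidate strings; take the best one.
    guard let observations = request.results as? [VNRecognizedTextObservation] else {
        return []
    }
    return observations.compactMap { $0.topCandidates(1).first?.string }
}
```

Note that this requires iOS 13 / macOS 10.15 or later, and (like Live Text itself) recognition quality depends on the device’s Neural Engine and the image’s clarity.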
Live Text Supported iPhones and iPads
Unfortunately, not all iOS 15 compatible iPhones and iPads support Live Text. So make sure you have a compatible iOS device to use this feature.
Here is the List of Live Text Compatible iPhone Models:
- iPhone Xs
- iPhone Xs Max
- iPhone XR
- iPhone 11
- iPhone 11 Pro
- iPhone 11 Pro Max
- iPhone SE 2
- iPhone 12
- iPhone 12 mini
- iPhone 12 Pro
- iPhone 12 Pro Max
Here is the List of Live Text Supported iPad Models:
- iPad Pro 2018
- iPad Pro 2020
- iPad Pro 2021
- iPad 8th-gen
- iPad Air 3
- iPad Air 4
Note: Live Text also supports the MacBook Air with M1, MacBook Pro with M1, Mac mini with M1, and the new 24-inch iMac (2021).
Live Text Supported Languages
Currently, Live Text in iOS 15 recognizes seven languages: English, French, Spanish, German, Portuguese, Italian, and Chinese. Apple is expected to add support for more languages in the coming months.
How to Use Live Text in iOS 15
Method 1: Use Live Text in Camera App on iPhone and iPad
Live Text is neatly integrated into the stock Camera app in iOS 15. So, whenever you come across something you want to look up, copy, share, or translate, you can do so with utmost ease.
1. Launch the Camera app on your device and point it at an object or text. After that, look for the “Live Text” icon and tap on it. Do note that the Live Text icon shows up in the bottom-right corner in portrait orientation and the bottom-left corner in landscape orientation.
2. Now, Live Text will recognize the text in the image. You can then select the text that you want and use one of the following options:
- Copy: Tap it to copy the selected text.
- Select All: Tap it to select all the extracted text.
- Look Up: Tap it to find the meaning of the selected word or search for it on the web.
- Translate: Tap it to translate the selected text.
- Share: Tap it to share the extracted text via email, iMessage, or any other app. You can also save the text in the Files app.
It’s worth pointing out that when you tap on a phone number, a new context menu will appear with multiple options like FaceTime, send a message, add to contacts, and copy. Likewise, if you tap on an email address, you will get the option to email right away.
Method 2: Use Live Text in Photos App on iPhone and iPad
Live Text works in the Photos app just as seamlessly. So, if you ever come across any image (including screenshots) and wish to extract text from it, you can get it done with ease. Just follow the steps below:
1. Open the Photos app on your iPhone or iPad. Then, navigate to the image you want to extract text from.
2. Now, tap the tiny Live Text icon (looks like a square viewfinder with three lines) at the bottom right corner of the screen. Live Text will instantly recognize all the available text in the image.
3. Next, select the text you want to copy, share, look up, or translate.
Method 3: Use Live Text to Recognize Handwriting in iOS 15
Live Text is equally efficient at extracting text from a handwritten note. If you wish to digitize your notes to keep them secure and accessible across devices, Live Text OCR can come in super handy.
Simply launch the Camera app on your iOS 15 device and frame the handwritten note in the viewfinder. Then, tap the Live Text button to convert the handwriting into text. Finally, select the text and copy, share, look up, or translate it as needed.
Use Live Text Inside Apps Like Apple Messages and WhatsApp in iOS 15
Imagine you are in the midst of a conversation with your colleague or friend. Suddenly, you need to extract text from an image and share it. Wouldn’t you want to get everything done without having to leave the conversation thread? Yes, you would. That’s the flexibility iOS 15 offers by letting you invoke Live Text from inside a messaging app.
Open a messaging app like Apple Messages. Then, tap and hold the text input field and select the “Text from Camera” option. Now, use your iPhone or iPad’s camera to extract the text from an image and share it without ever leaving the messaging app. Pretty cool, isn’t it?
Use Live Text in Spotlight Search on iOS 15
Spotlight has many interesting tricks up its sleeve, including the ability to let you use Live Text in iOS 15. To use the feature, swipe down from the middle of the home screen to bring up Spotlight Search. After that, touch the text input field and select the “Text from Camera” option. Now, go ahead and extract and use the text as explained in the steps above.
Use Live Text to Search in Safari on iOS 15
Having a smart OCR tool accessible within a browser at all times means you can use it to recognize text and get started with the search without wasting any time. So, the next time you come across something that makes you curious, do not forget to bring up the Live Text OCR feature in Safari on iOS 15.
To use it, tap and hold the search bar textbox and then select the “Text from Camera” option. Now, follow the steps detailed above to extract text from an image or real-life object and get started with a quick search.
Check File Size of the Selected Live Text Before Sharing
Interestingly, iOS 15 also lets you check the file size of the selected text before sharing. It could come in handy when you want the file size of the selected text to be minimal for smooth sharing.
1. Once you have used the Live Text feature to recognize text on your iPhone or iPad, select the text you want to share. Then, tap the “Share” button in the pop-up context menu.
2. Now, choose the “Show File Size” option from the iOS 15 Share sheet. Voila! It will instantly show the file size of the selected text at the top of the screen.
Translate Selected Text Offline Using Live Text in iOS 15
Notably, you can also translate selected text offline. This comes in handy when your iOS 15 device is not connected to the Internet, but you still want Live Text translations to work without any hiccups. Bear in mind that offline translations may not be as accurate as online ones.
To get started, make sure you have enabled on-device translation. Also, ensure that you have downloaded the languages that you want to translate.
1. Open the Settings app on your iOS 15 device. After that, scroll down and tap “Translate“.
2. Now, turn on the toggle next to the “On-Device Mode” option.
3. Next, tap “Downloaded Languages” and download the languages you want to translate so that they are available offline. We have listed the supported languages earlier in this article.
4. That’s it! From now onwards, you will be able to translate languages even when your iPhone or iPad is not connected to the Internet.
Apple Live Text vs Google Lens: Which Is Better?
While Google Lens has been around for several years, Apple’s Live Text feature has only just arrived. So, in terms of maturity, Google’s offering takes the cake. Not only that, Google Lens also has the upper hand when it comes to universal compatibility.
That’s not the case with Live Text, as it’s limited to iOS 15, iPadOS 15, and macOS Monterey. Another department where Lens is ahead is translation: it supports a whopping 103 languages as opposed to Live Text’s seven.
But what tilts the tide in favor of Live Text is its unmatched system-wide integration into the Apple ecosystem. And that is likely to get even better with time. That brings us to the final frontier: efficiency. To ward off the temptation of a surefire conclusion, I would direct you to Akshay’s in-depth comparison of Apple Live Text and Google Lens.
Tips to Use Live Text on iPhone and iPad Like a Pro
That’s all about Live Text in iOS 15! Even though iOS 15 is still a work in progress, Live Text works quite reliably. And by the time Apple eventually rolls out the latest iteration of iOS this fall, it should become even better. Google Lens may have years of expertise under its belt, but Apple’s Live Text seems to have a clear edge in ease of use and intuitiveness – at least for now. By the way, what’s your take on Live Text in iOS 15? Share your thoughts on the coolest iOS 15 features, including Focus mode, the ability to drag and drop files across apps, FaceTime calls to Android, Apple Digital Legacy, and more.