Apple Unveils iOS 18 Accessibility Features Including AI-Powered Eye Tracking

In Short
  • Apple previewed some exciting accessibility features arriving later this year with iOS 18, iPadOS 18, and visionOS 2.
  • iOS 18 and iPadOS 18 accessibility features include Eye Tracking, Music Haptics, Vocal Shortcuts, Vehicle Motion Cues, and CarPlay improvements.
  • Eye Tracking will allow iPhone and iPad users to navigate their devices with just their eyes.

Apple today announced new accessibility features arriving later this year with its upcoming software updates: iOS 18, iPadOS 18, and visionOS 2. The Cupertino tech giant never fails to impress when it comes to accessibility, and this year is no exception, with several useful features designed to help users with a range of disabilities.

Apple previewed several useful features for the iPhone and iPad, but the one likely to grab the most attention is Eye Tracking. Yes, one of the most remarkable features of the Apple Vision Pro is making its way to the iPhone and iPad.

Powered by artificial intelligence, Eye Tracking will allow iPhone and iPad users to navigate their devices with just their eyes, much like what we have seen on the Vision Pro. Eye Tracking on the iPhone and iPad is designed especially for users with physical disabilities, letting them control their devices just by looking at them.

The feature uses the front-facing camera to determine which element the user is looking at. Users can gaze at a button to highlight it and then hold their gaze for a few seconds to select it. Because the processing relies on on-device machine learning, all data is stored securely on the device and isn’t shared with anyone, not even Apple.

It’s worth knowing that Eye Tracking works across iOS and iPadOS apps, and you don’t need any additional hardware or accessories. With Dwell Control, you can activate any element and access additional functions such as swipes, physical button presses, and other gestures, solely with your eyes.
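To make the dwell-to-select idea more concrete, here is a minimal Swift sketch of how an app might turn a stream of gaze estimates into selections. Apple hasn’t published how Eye Tracking works under the hood, so the GazeSample type and DwellSelector class below are purely illustrative assumptions, not an Apple API.

```swift
import Foundation

// Hypothetical gaze sample: which on-screen element the user is looking at,
// as estimated by on-device eye tracking. Not an Apple API; purely illustrative.
struct GazeSample {
    let elementID: String       // identifier of the UI element under the gaze
    let timestamp: TimeInterval
}

// Tracks how long the gaze stays on one element and fires a selection once the
// dwell threshold is reached (the "hold your gaze to select" idea).
final class DwellSelector {
    private let dwellThreshold: TimeInterval
    private var currentElement: String?
    private var dwellStart: TimeInterval = 0
    private let onSelect: (String) -> Void

    init(dwellThreshold: TimeInterval = 1.5, onSelect: @escaping (String) -> Void) {
        self.dwellThreshold = dwellThreshold
        self.onSelect = onSelect
    }

    func process(_ sample: GazeSample) {
        if sample.elementID != currentElement {
            // Gaze moved to a new element: highlight it and restart the dwell timer.
            currentElement = sample.elementID
            dwellStart = sample.timestamp
        } else if sample.timestamp - dwellStart >= dwellThreshold {
            // Gaze held long enough: treat it as a tap on that element.
            onSelect(sample.elementID)
            dwellStart = sample.timestamp   // reset so we don't re-select immediately
        }
    }
}

// Usage: feed gaze samples as they arrive from the (hypothetical) tracker.
let selector = DwellSelector { id in print("Selected \(id)") }
selector.process(GazeSample(elementID: "playButton", timestamp: 0.0))
selector.process(GazeSample(elementID: "playButton", timestamp: 1.6)) // prints "Selected playButton"
```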

“We believe deeply in the transformative power of innovation to enrich lives. That’s why for nearly 40 years, Apple has championed inclusive design by embedding accessibility at the core of our hardware and software. We’re continuously pushing the boundaries of technology, and these new features reflect our long-standing commitment to delivering the best possible experience to all of our users.”- Tim Cook, Apple CEO

Besides Eye Tracking, Apple also brings Music Haptics to allow users who are deaf or hard of hearing to enjoy music on the iPhone like everyone else. When enabled, Music Haptics has the Taptic Engine in the iPhone play taps, textures, and refined vibrations in time with the audio of the music.
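Apple hasn’t detailed how Music Haptics generates its patterns, but the general idea of driving the Taptic Engine in time with audio can be sketched with the existing Core Haptics framework. The example below plays a sharp transient tap at pre-computed beat times; the beat times and intensities are assumed inputs (in practice they would come from some form of audio analysis), and this is not Apple’s Music Haptics implementation.

```swift
import CoreHaptics

// Illustrative only: a rough sketch of driving the Taptic Engine in time with
// music using Core Haptics. This is NOT Apple's Music Haptics implementation;
// the beat times and intensities are assumed to come from prior audio analysis.
func playBeatHaptics(beats: [(time: TimeInterval, intensity: Float)]) throws {
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }

    // A real app would keep the engine alive for the duration of playback.
    let engine = try CHHapticEngine()
    try engine.start()

    // One sharp transient tap per beat, scaled by that beat's intensity.
    let events = beats.map { beat in
        CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: beat.intensity),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.6)
            ],
            relativeTime: beat.time
        )
    }

    let pattern = try CHHapticPattern(events: events, parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}

// Usage: taps at 0.0s, 0.5s, and 1.0s with varying strength,
// started alongside music playback.
try? playBeatHaptics(beats: [(time: 0.0, intensity: 1.0),
                             (time: 0.5, intensity: 0.6),
                             (time: 1.0, intensity: 0.9)])
```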

Image Courtesy: Apple

Next up, there’s a Vocal Shortcuts option that lets iPhone and iPad users assign custom utterances that Siri understands to launch shortcuts and complete complex tasks. Apple also announced a new Vehicle Motion Cues feature that can help reduce motion sickness for passengers in moving vehicles.
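Vocal Shortcuts is a user-facing setting, so Apple hasn’t shared anything for developers to adopt here. Still, the shortcuts it launches are the same kinds of actions apps already expose through the existing App Intents framework, and the sketch below shows what such an action might look like. The intent name and its behavior are made up for illustration.

```swift
import AppIntents

// Illustrative sketch: the kind of app-provided action a custom utterance could
// trigger. "StartMeditationIntent" and its behavior are hypothetical; App Intents
// is Apple's existing framework for exposing actions to Siri and Shortcuts.
struct StartMeditationIntent: AppIntent {
    static var title: LocalizedStringResource = "Start Meditation"
    static var description = IntentDescription("Begins a short guided meditation session.")

    // How long the session should run, supplied when the shortcut is set up.
    @Parameter(title: "Minutes", default: 5)
    var minutes: Int

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // In a real app this would start playback, timers, and so on.
        return .result(dialog: "Starting a \(minutes)-minute meditation.")
    }
}
```

Once an app exposes an action like this, a user could add it as a shortcut and then, with Vocal Shortcuts, assign a custom utterance to trigger it.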

Later this year, CarPlay will also gain a bunch of new features. First is Voice Control, which lets you navigate CarPlay and control apps with just your voice. With Sound Recognition, drivers or passengers who are deaf or hard of hearing can turn on alerts to be notified of car horns. And for people who are colorblind, Color Filters make the CarPlay interface visually easier to use, along with additional options like Bold Text and Large Text.

Besides iOS and iPadOS, Apple also previewed accessibility features coming to visionOS. These include systemwide Live Captions, Reduce Transparency, Dim Flashing Lights, Smart Invert, and more.

We’ll get our first preview of iOS 18, iPadOS 18, and visionOS 2 at Apple’s WWDC 2024 keynote, which kicks off on June 10.
