Apple Announces Live Captions, Door Detection, and More New Accessibility Features

Apple already offers a range of accessibility features that make its devices easier to use for people with disabilities. Now, ahead of Global Accessibility Awareness Day tomorrow, the Cupertino giant has announced a batch of new accessibility features for iPhone, iPad, and Apple Watch users. Check out the details below.

Apple Announces New Accessibility Features

Apple recently published an official blog post announcing the new accessibility features for iPhone, iPad, and Apple Watch. These include new object detection options, such as detecting closed doors, within the Magnifier tool, a new Apple Watch Mirroring feature, and a new Live Captions feature for users who are deaf or hard of hearing.

Live Captions

Starting with Live Captions, this feature will benefit users who are hard of hearing or completely deaf. Google introduced a similar capability with Android 10, and Apple has now caught up with the Mountain View giant with this addition. Microsoft is also in the race, having recently added Live Captions to Windows 11.

Live Captions is supported on iPhone, iPad, and Mac, and lets users turn on live subtitles for any kind of audio content, whether it is a FaceTime or video call (header image), a video playing on the device, or even an in-person conversation. Furthermore, when using Live Captions for calls on Mac, users can type a response and have it read aloud instantly to the other participant(s) on the call.

Door Detection

Coming to the Door Detection feature, it is essentially an addition to the Magnifier tool on iPhone and iPad that helps users detect a closed or open door along their path. It uses the LiDAR sensor on supported iPhone and iPad models to locate the door and describe its attributes to the user.


The feature detects whether a door is open or closed. If closed, it also identifies whether the door can be opened by pushing, pulling, or turning a knob. It can even read aloud text, signs, and symbols on doors, such as room numbers. Door Detection can be used on its own or alongside Magnifier's existing Image Descriptions and People Detection features. However, it is worth noting that since it relies on the LiDAR sensor, the feature is only supported on iPhone and iPad models equipped with that sensor.

Apple Watch Mirroring

Next is the Apple Watch Mirroring feature, which allows users with physical and motor disabilities to control their Apple Watch from a connected iPhone. The feature uses Apple's AirPlay technology and lets users operate the Apple Watch remotely using the iPhone's assistive features, such as Voice Control and Switch Control.

Furthermore, users can rely on inputs like voice commands, sound actions, head tracking, and third-party Made for iPhone (MFi) switches as alternatives to tapping the Apple Watch display.


Beyond these, Apple has added support for over 20 new languages and locales in its VoiceOver feature, including Bengali, Bulgarian, Catalan, Ukrainian, and Vietnamese. These new languages will also be available in other accessibility features like Speak Screen and Speak Selection.

Additionally, Apple has announced plans to expand accessibility features for Siri, Sound Recognition, and more. There is also a new Buddy Controller feature that lets users link two game controllers so they act as a single one, allowing users with disabilities to play games with help from friends and family members on supported Apple devices.

So, what do you think about these new accessibility features for iPhone, iPad, Mac, and Apple Watch? Let us know your thoughts in the comments below, and stay tuned for more informative news stories.
