Apple has recently been adding a slew of new accessibility features to its iPhones. The device can already alert users when it hears sounds like a doorbell or a crying baby. Now the iPhone 12 Pro and Pro Max could come in especially handy for blind and low-vision users, as Apple has debuted a new accessibility feature in the latest iOS 14 beta.
Called People Detection, the feature uses augmented reality and machine learning to tell users about the people and objects in the space around them. It lives inside the Magnifier app. So what is this accessibility feature for, and how does it work?
As the name suggests, the new iPhones can alert you when other people or objects are near you, helping users with vision impairments navigate the world a little more easily. So what does a blind user actually experience when using the feature?
Your iPhone 12 Pro will continuously tell you whether or not there is a person in the camera's view. If there is, it will announce their distance from you and update it in real time. Impressively, the audio is stereo and appears to come from the person's direction. If you don't want a constant readout, you can instead set tones that play at certain distances, and the alerts can be paired with haptic feedback to help users with hearing impairments as well.
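Apple hasn't published how People Detection is implemented, but the behavior maps onto ARKit APIs that are public. Here's a minimal sketch of the idea, assuming ARKit's person segmentation with per-pixel depth; the class name, the simple haptic-only feedback, and the 2-metre threshold are illustrative choices, not Apple's code.

```swift
import ARKit
import UIKit

// Sketch only: find the nearest person in the camera view each frame and
// surface the distance. Apple's feature adds spoken distances, stereo audio
// cues and configurable tones on top of this kind of signal.
final class PeopleDistanceSketch: NSObject, ARSessionDelegate {
    private let session = ARSession()
    private let haptics = UIImpactFeedbackGenerator(style: .heavy)

    func start() {
        let config = ARWorldTrackingConfiguration()
        // Person segmentation with depth gives a per-pixel "is this a person?"
        // mask plus an estimated distance for those pixels.
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) else { return }
        config.frameSemantics.insert(.personSegmentationWithDepth)
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let mask = frame.segmentationBuffer,   // which pixels belong to people
              let depth = frame.estimatedDepthData   // distance (metres) for those pixels
        else { return }
        if let metres = nearestPersonDistance(mask: mask, depth: depth) {
            // Arbitrary example: fire a haptic when someone is within 2 metres.
            if metres < 2.0 { haptics.impactOccurred() }
            print(String(format: "Nearest person ~%.1f m away", metres))
        }
    }

    private func nearestPersonDistance(mask: CVPixelBuffer, depth: CVPixelBuffer) -> Float? {
        CVPixelBufferLockBaseAddress(mask, .readOnly)
        CVPixelBufferLockBaseAddress(depth, .readOnly)
        defer {
            CVPixelBufferUnlockBaseAddress(mask, .readOnly)
            CVPixelBufferUnlockBaseAddress(depth, .readOnly)
        }
        guard let maskBase = CVPixelBufferGetBaseAddress(mask),
              let depthBase = CVPixelBufferGetBaseAddress(depth) else { return nil }
        let width = CVPixelBufferGetWidth(mask)
        let height = CVPixelBufferGetHeight(mask)
        let maskStride = CVPixelBufferGetBytesPerRow(mask)
        let depthStride = CVPixelBufferGetBytesPerRow(depth)
        var nearest = Float.greatestFiniteMagnitude
        for y in 0..<height {
            let maskRow = maskBase.advanced(by: y * maskStride).assumingMemoryBound(to: UInt8.self)
            let depthRow = depthBase.advanced(by: y * depthStride).assumingMemoryBound(to: Float32.self)
            // Non-zero mask values mark person pixels; keep the smallest depth.
            for x in 0..<width where maskRow[x] != 0 {
                let d = depthRow[x]
                if d > 0, d < nearest { nearest = d }
            }
        }
        return nearest == .greatestFiniteMagnitude ? nil : nearest
    }
}
```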
How did this accessibility feature come to be? Apple introduced 'people occlusion' in 2019 as part of ARKit 3. It enables the iPhone to detect people in the scene so that virtual objects can be placed around, and behind, them. Apple combined that capability with the highly accurate distance measurements from the LiDAR sensor on the iPhone 12 Pro and Pro Max to build People Detection. There's a minor catch, though.
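Both building blocks are exposed to third-party developers through ARKit, so the combination is easy to picture even if Apple's own pipeline isn't public. The snippet below only shows those two pieces being requested together; the fallback for non-LiDAR devices is my assumption, not something the article describes.

```swift
import ARKit

// Illustrative only: ask ARKit for people-occlusion semantics together with
// LiDAR scene depth, falling back to person segmentation alone on devices
// without a LiDAR sensor.
func makePeopleAwareConfiguration() -> ARWorldTrackingConfiguration {
    let config = ARWorldTrackingConfiguration()
    let wanted: ARConfiguration.FrameSemantics = [.personSegmentationWithDepth, .sceneDepth]
    if ARWorldTrackingConfiguration.supportsFrameSemantics(wanted) {
        config.frameSemantics = wanted  // iPhone 12 Pro / Pro Max (LiDAR)
    } else if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
        config.frameSemantics = .personSegmentationWithDepth  // occlusion without LiDAR depth
    }
    return config
}
```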
The feature also uses the device's wide-angle camera to cover as much of the scene as possible, which is why People Detection won't work in dark environments. The LiDAR sensor and the camera work together to generate the alerts. Still, the iPhone 12 Pro and 12 Pro Max could be an important high-tech utility for visually impaired people.
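ARKit does report a per-frame light estimate, so an app built on the same pieces could at least warn users when the scene is too dark for the camera to help. This is a rough sketch; the 100-lumen cutoff is an arbitrary illustration, not a documented threshold.

```swift
import ARKit

// Rough heuristic: ARKit's ambient intensity is given in lumens, where about
// 1000 corresponds to a well-lit environment. The cutoff here is a guess.
func sceneLooksTooDark(in frame: ARFrame) -> Bool {
    guard let lumens = frame.lightEstimate?.ambientIntensity else { return false }
    return lumens < 100
}
```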