At its Google I/O developer conference on Tuesday, Google announced a new app called ‘Lookout’ that is designed to help visually impaired users learn about their surroundings. The app will launch on the Play Store in the US later this year and, according to the company, will help blind or visually impaired people become more independent “by giving auditory cues as they encounter objects, text and people around them”.
According to Google, Lookout is meant to be used with the device placed in a shirt pocket or hung on a lanyard around the user’s neck, with the rear-facing camera pointing forward. The app uses the phone’s camera to detect nearby objects, people and animals, and relays that information to the user as audio, helping them avoid bumping into obstacles and injuring themselves. The app will also be able to read text aloud, such as ‘Exit’ signs over doors.
The core experience is processed on the device, which means the app can be used without an internet connection. “Accessibility will be an ongoing priority for us,” the company said, “and Lookout is one step in helping blind or visually impaired people gain more independence by understanding their physical surroundings.”
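Google has not published Lookout’s internals, but a pipeline like the one described, on-device object detection feeding spoken cues, can be sketched with public Android APIs. The snippet below is a minimal illustration using ML Kit’s on-device object detector and the platform TextToSpeech engine; it is an assumption about the general approach, not Lookout’s actual code. Because ML Kit’s default detector runs locally, the sketch also works offline, consistent with Google’s description.

```kotlin
import android.content.Context
import android.speech.tts.TextToSpeech
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.objects.ObjectDetection
import com.google.mlkit.vision.objects.defaults.ObjectDetectorOptions
import java.util.Locale

// Illustrative sketch (not Lookout's code): detect objects in a camera
// frame on-device and announce them via text-to-speech.
class SurroundingsAnnouncer(context: Context) {

    private lateinit var tts: TextToSpeech

    init {
        tts = TextToSpeech(context) { status ->
            if (status == TextToSpeech.SUCCESS) tts.setLanguage(Locale.US)
        }
    }

    // ML Kit's base object detector runs entirely on-device, so detection
    // keeps working without an internet connection.
    private val detector = ObjectDetection.getClient(
        ObjectDetectorOptions.Builder()
            .setDetectorMode(ObjectDetectorOptions.STREAM_MODE) // live camera feed
            .enableClassification()                             // coarse labels
            .build()
    )

    // Called for each camera frame (e.g. from a CameraX ImageAnalysis analyzer).
    fun announce(image: InputImage) {
        detector.process(image)
            .addOnSuccessListener { objects ->
                for (obj in objects) {
                    val label = obj.labels.firstOrNull()?.text ?: continue
                    // Queue rather than interrupt, so cues stay unobtrusive.
                    tts.speak(label, TextToSpeech.QUEUE_ADD, null, label)
                }
            }
    }
}
```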
According to Patrick Clary, product manager for Google’s Central Accessibility Team, “Lookout delivers spoken notifications, designed to be used with minimal interaction, allowing people to stay engaged with their activity”. He added that the app “will use machine learning to learn what people are interested in hearing about, and will deliver these results more often”.
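Clary’s description suggests a feedback loop in which the app weights future notifications by observed interest. Google has not disclosed how this learning works; the sketch below is purely illustrative, with hypothetical names, showing one simple way such prioritization could be done: count engagements per category and suppress categories the user rarely responds to.

```kotlin
// Hypothetical sketch of interest-weighted notifications; Lookout's actual
// learning approach has not been published.
class InterestModel {
    private val engagements = mutableMapOf<String, Int>() // category -> count
    private var total = 0

    // Record that the user engaged with a cue (e.g. asked to hear it again).
    fun recordEngagement(category: String) {
        engagements[category] = (engagements[category] ?: 0) + 1
        total++
    }

    // Announce a category more often the more the user has engaged with it.
    fun shouldAnnounce(category: String): Boolean {
        if (total < 10) return true  // too little data: announce everything
        val share = (engagements[category] ?: 0).toDouble() / total
        return share >= 0.05         // skip categories under a 5% engagement share
    }
}
```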
Within the app, users will be able to choose from four modes: Home, Work & Play, Scan and Experimental. The last of these, Clary points out, will let users try out features that are still in development and not yet ready for prime time. The Scan mode, meanwhile, will be able to read text, such as a recipe from a cookbook.
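A text-reading mode like Scan maps naturally onto on-device OCR followed by speech output. As a rough sketch, again assuming ML Kit’s public text-recognition API rather than Lookout’s private implementation:

```kotlin
import android.speech.tts.TextToSpeech
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.text.TextRecognition
import com.google.mlkit.vision.text.latin.TextRecognizerOptions

// Sketch of a Scan-style mode: recognize text in a captured frame on-device
// and read it aloud block by block (e.g. the steps of a recipe).
fun readAloud(image: InputImage, tts: TextToSpeech) {
    val recognizer = TextRecognition.getClient(TextRecognizerOptions.DEFAULT_OPTIONS)
    recognizer.process(image)
        .addOnSuccessListener { result ->
            for (block in result.textBlocks) {
                tts.speak(block.text, TextToSpeech.QUEUE_ADD, null, block.text)
            }
        }
}
```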