Google uses its AI expertise to help the blind explore their surroundings

Doris Richards
March 14, 2019

The search giant occasionally puts its extensive expertise in artificial intelligence and machine learning to work for a greater good, helping those less fortunate leverage modern technology in truly life-changing ways. Sound Amplifier, which was also announced at last year's Google I/O, uses a phone and a set of headphones to filter, augment, and amplify sounds so that users can better hear conversations or announcements in noisy environments.

Google has now launched a new app called Lookout, designed to give visually impaired users spoken information about their surroundings using artificial intelligence.

The Google Lookout app is finally here. It offers three modes and starts in Explore mode by default, which describes items the camera sees. Shopping mode is meant to help with barcodes and currency, while Quick read mode is best for sorting mail as well as reading signs and labels. The app also provides a camera view for live recognition.

[Screenshots: Lookout's modes, including "Explore", "Shopping", and "Quick read", and Lookout detecting a dog in the camera frame.]

Unlike other apps, Lookout doesn't require further tapping: once it's opened, users simply keep their phone pointed forward and Lookout tells them about nearby items.

It is now available on the Google Play Store for all Pixel devices running Android 8.0 Oreo or later, and is currently available only in English.

For easy access to Lookout, Google advises users to hold or wear the device, for example by hanging a Pixel phone from a lanyard around the neck or placing it in a front shirt pocket.

Google Lookout is currently available only for Google Pixel smartphones, and only within the US.

For what it's worth, Google's "hope" is that availability will be expanded "soon" to "more devices, countries, and platforms", although there is no word on a timetable of any sort.
