Google is developing an app it hopes will help the millions of blind and visually impaired people in the world become more independent. The app, called Lookout, will be available later this year, but the company took the time to make an official announcement at the ongoing I/O conference. We've seen a few apps made specifically for this audience, such as Microsoft's Soundscape, which helps people navigate cities by providing audio cues and labels, and now it's Google's turn to launch one.

How does the app work? Once it's open, users select the mode that best describes the environment they're in, whether that's being at home, at work, in a shopping mall, or in a situation where they need to have text read aloud to them ("scan mode"). Depending on the mode, the app processes items of importance in the environment and then shares the information it considers relevant. It delivers this as spoken notifications to minimize interaction and allow users to continue their activity without checking the phone. For the app to work, Google recommends wearing the smartphone around the neck (with a lanyard) or in a shirt pocket. Before any of this can happen, though, we'll have to wait for the app to arrive on the Play Store.
The app's Scan mode can also read text aloud, such as a recipe from a cookbook, while its Experimental mode gives users early access to features still in development. For example, the app can inform a user that there is a "seat 3 o'clock," so that they don't bump into the object on their right. For this to work, the camera has to face the surroundings.
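That clock-face phrasing suggests a simple mapping from an object's direction to a spoken label. As a rough sketch only (Lookout's actual implementation isn't public, and the function names and phrasing here are hypothetical, based on the example quoted above):

```python
# Illustrative sketch: map a bearing relative to the camera's heading
# (0 degrees = straight ahead, increasing clockwise) to the nearest
# clock-face hour, as in "seat 3 o'clock".

def clock_position(bearing_degrees: float) -> int:
    """Return the clock-face hour (12, 1, ..., 11) nearest the bearing."""
    hour = round((bearing_degrees % 360) / 30) % 12
    return 12 if hour == 0 else hour

def announce(label: str, bearing_degrees: float) -> str:
    # Hypothetical phrasing modeled on the article's example.
    return f"{label} {clock_position(bearing_degrees)} o'clock"

print(announce("seat", 90))  # an object directly to the user's right
```

Each of the twelve clock positions covers a 30-degree slice, so an object at 90 degrees (directly right) lands at 3 o'clock and one straight ahead at 12 o'clock.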
Since Lookout relies on machine learning, the more you use it, the better results it can give you.