Google Lookout: App reads food labels for the blind

An illustration shows a phone screen with a mustard jar on the left and a document on the right, with a speech bubble emerging from the mustard jar to illustrate the app identifying it.

Image copyright: Google

Google's AI can now identify groceries in the supermarket to help the visually impaired.

It is part of Google's Lookout app, which is designed to help people with visual impairments identify things around them.

A new update has added the ability for a computer voice to say out loud what food a person is holding, based on its visual appearance.

A UK blindness charity welcomed the move, saying it could help strengthen the independence of the blind.

Google says the feature "will be able to differentiate between a can of corn and a can of green beans".

Eye-catching, not easy to read

Many apps, such as calorie trackers, have long used product barcodes to identify what someone is eating. According to Google, Lookout also uses image recognition to identify a product by its packaging.

The app for Android phones contains around two million "popular products" in a database that is stored on the phone. This catalog changes depending on where the user is in the world, according to a post on Google's AI blog.
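
Google has not published how that catalogue works internally. Purely as an illustration, the Kotlin sketch below shows one way a region-specific, on-device product database might be queried; every type and function in it (Product, ProductCatalog, recognisePackaging) is hypothetical, not part of the Lookout app.

```kotlin
// Hypothetical sketch only: illustrates the idea of a region-specific,
// on-device product catalogue, not Lookout's actual implementation.

data class Product(val id: String, val name: String)

class ProductCatalog(private val productsById: Map<String, Product>) {
    fun lookup(id: String): Product? = productsById[id]

    companion object {
        // A real app would load ~2 million entries for the user's region
        // from local storage; here we hard-code two examples.
        fun forRegion(region: String): ProductCatalog = when (region) {
            "US" -> ProductCatalog(mapOf(
                "can-001" to Product("can-001", "Canned corn"),
                "can-002" to Product("can-002", "Canned green beans")
            ))
            else -> ProductCatalog(emptyMap())
        }
    }
}

// Stand-in for the image-recognition step: a real implementation would run
// an on-device model over the camera frame and return a product id.
fun recognisePackaging(frame: ByteArray): String? = null

fun announceProduct(frame: ByteArray, region: String, speak: (String) -> Unit) {
    val catalog = ProductCatalog.forRegion(region)
    val product = recognisePackaging(frame)?.let(catalog::lookup)
    if (product != null) {
        speak(product.name)
    } else {
        // Mirrors the behaviour the reporter saw: ask the user to
        // show the package from a different angle.
        speak("Try rotating the package to a different angle.")
    }
}
```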

In a kitchen cupboard test by a BBC reporter, the app had no trouble identifying a popular American hot sauce brand and a similar product from Thailand. It also correctly read spices, jars and cans from UK supermarkets, as well as the imported Australian favourite Vegemite.

However, it fared less well with fresh produce, such as onions and potatoes, and with irregularly shaped containers, such as tubes of tomato paste and bags of flour.

If there were problems, the app's voice would prompt the user to rotate the package to a different angle – but it still failed on several items.

  • A blind inventor and a suitcase that "sees"
  • Blind man defeats Domino's Pizza in court battle over app

The UK's Royal National Institute of Blind People (RNIB) cautiously welcomed the new feature.

"Food labels can be challenging for people with visual impairments because they are often more noticeable than easy to read," said Robin Spinks of the charity.

"Ideally, we would like the accessibility of labels to be built into the design process to make them easier for the visually impaired to navigate."

Alongside similar apps such as Be My Eyes and NaviLens, which are also available on iPhones, he said it could "help strengthen the independence of people with vision loss by quickly and easily identifying products".

Media caption: Be My Eyes, how smartphones became "eyes" for the blind

Lookout uses technology similar to Google Lens, the app that can recognise what a smartphone camera is pointed at and show the user more information. The app already had a mode that read aloud any text the camera was pointed at, and an "exploration mode" that identified objects and text.
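
Lookout's own code is not public, but Android developers can build a similar "read any text" mode by combining Google's ML Kit on-device text recogniser with the platform's TextToSpeech engine. The Kotlin sketch below shows that combination as an assumption about the general approach, not as Lookout's actual implementation.

```kotlin
import android.graphics.Bitmap
import android.speech.tts.TextToSpeech
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.text.TextRecognition
import com.google.mlkit.vision.text.latin.TextRecognizerOptions

// Runs on-device OCR on a camera frame and speaks any text found.
// Illustrative sketch only; Lookout's internals are not public.
fun readTextAloud(frame: Bitmap, tts: TextToSpeech) {
    val image = InputImage.fromBitmap(frame, /* rotationDegrees = */ 0)
    val recognizer = TextRecognition.getClient(TextRecognizerOptions.DEFAULT_OPTIONS)
    recognizer.process(image)
        .addOnSuccessListener { result ->
            if (result.text.isNotBlank()) {
                // Speak the recognised text immediately, replacing any
                // utterance that is still queued.
                tts.speak(result.text, TextToSpeech.QUEUE_FLUSH, null, "ocr")
            }
        }
        .addOnFailureListener { e ->
            // A production app would surface this accessibly.
            e.printStackTrace()
        }
}
```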

When Google launched the app last year, it recommended putting the smartphone in a shirt pocket or hanging it from a lanyard around the neck, so the camera can see what is directly in front of the user.

Another new feature added in the update is a document scanner, which takes a photo of letters and other documents and sends the result to a screen reader to be read aloud.
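
Again as an assumption rather than a description of Lookout's internals: one standard way an Android app hands text to the system screen reader (such as TalkBack) is View.announceForAccessibility, sketched below.

```kotlin
import android.view.View

// Hands recognised document text to the active screen reader (e.g. TalkBack).
// Illustrative only; Lookout's actual document flow is not public.
fun sendToScreenReader(rootView: View, documentText: String) {
    if (documentText.isNotBlank()) {
        rootView.announceForAccessibility(documentText)
    }
}
```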

Google also says it has made improvements to the app based on feedback from visually impaired users.