Google “Look to Speak”: When eyes learn to speak

With a new Android app, Google wants to help people with motor and speech impairments communicate with others. The best part: all you need is an Android smartphone, without any further accessories.

Accessibility features on current smartphones and tablets make everyday life easier for many people. With “Look to Speak”, Google demonstrates how such a feature can help people with motor and speech impairments.

Richard Cave, Speech & Language Therapist at Google, describes how a small group of people at the company experimented with the capabilities of current smartphones. The resulting app, called “Look to Speak”, lets users select predefined sentences with their eyes, which the phone then speaks aloud. The app can be downloaded from the Play Store and requires Android 9.0 or later; devices with Android One are also supported.

“Look to Speak”: Speech control with the eyes

With “Look to Speak”, the app is controlled solely through eye movements. The interface is divided into three areas: depending on whether you move your eyes to the left, right, or up, the corresponding phrase group is selected and then narrowed down step by step until you reach the desired phrase. Looking up also lets you jump back to the beginning of the selection.
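The selection process described above can be sketched as a simple repeated halving of a phrase list. The following Python snippet is an illustrative model only, not Google’s actual implementation; the function name, the gaze labels, and the sample phrases are all assumptions:

```python
# Hypothetical sketch of gaze-based phrase selection: the phrase list is
# repeatedly split in half, a "left" or "right" glance keeps one half,
# and an "up" glance restarts from the full list.

def select_phrase(phrases, gazes):
    """Narrow a list of phrases using a sequence of gaze events."""
    candidates = list(phrases)
    for gaze in gazes:
        if gaze == "up":
            # Jump back to the beginning of the selection.
            candidates = list(phrases)
        elif gaze == "left":
            # Keep the left half of the remaining candidates.
            candidates = candidates[: (len(candidates) + 1) // 2]
        elif gaze == "right":
            # Keep the right half of the remaining candidates.
            candidates = candidates[(len(candidates) + 1) // 2:]
        if len(candidates) == 1:
            # Only one phrase left: this is the one to speak aloud.
            return candidates[0]
    return None

phrases = ["Yes", "No", "Thank you", "I need help"]
print(select_phrase(phrases, ["left", "right"]))  # → No
```

Each glance halves the remaining options, so even a long phrase list can be reached in only a handful of eye movements.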

Google allows users to customize the phrases as they wish, so that the app can be more closely tailored to their personal requirements. However, this personalization, as well as the fine-tuning of gaze sensitivity, cannot be done via eye control; here, Google relies on traditional smartphone input methods.

Google experiments with “Start with One”

“Look to Speak” is a result of Google’s “Start with One” initiative, which is itself part of a larger program called “Experiments with Google”. These projects start, as the name suggests, with the idea that they might initially help just one person and only later benefit a wider community. Another such application is “Teachable Machine”, which lets users train a machine learning model via a web interface without any programming knowledge.

Richard Cave said that it was fascinating to see how “Look to Speak” could be used in situations that previous solutions couldn’t easily reach. As examples, he cites use outdoors, in transit, in the shower, and in urgent situations. “Now conversations can more easily happen where before there might have been silence, and I’m excited to hear some of them,” says Cave.

Source:
Google, Google Start with One
