Google’s amazing new app lets you speak using just your eyes


While we wait for the Babel fish translator from “The Hitchhiker’s Guide to the Galaxy,” Google has been working on an entirely new way for people to communicate.

The company’s latest experimental app is geared toward those with speech and motor impairments. It enables users to select a word or phrase — with just their eyes — that is then spoken aloud.

Yes, tech that allows you to control your phone with just your eyes is here. Let’s jump into how it works and how you or a loved one can start using this app.

What is Look to Speak?

Look to Speak, available now in the Google Play store, allows you to communicate by simply looking at words and phrases displayed on the screen.

The app uses your front-facing camera to track the movement of your eyes. It displays pre-written phrases divided into two vertical lists. Looking off-screen to the left or the right will select one of the lists; to cancel a selection or to snooze, simply look up over the screen.

After each selection, the number of words and phrases narrows down, and your phone will read the remaining one out loud. You’re not limited to Google’s pre-written phrases, either. You can add your own by tapping on the menu icon in the top-right corner.
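Google hasn’t published the exact selection logic, but the narrowing described above works like a binary search: each gaze halves the remaining candidates until one phrase is left to speak. Here’s a minimal Python sketch of that flow (the function and phrase list are illustrative, not from the app):

```python
# Illustrative sketch of Look to Speak-style selection: each "left" or
# "right" gaze keeps one half of the remaining phrases until a single
# phrase survives, which the app would then read aloud.

def select_phrase(phrases, gazes):
    """Narrow a list of phrases with a sequence of 'left'/'right' gazes."""
    candidates = list(phrases)
    for gaze in gazes:
        mid = (len(candidates) + 1) // 2
        # A left gaze keeps the left column; a right gaze keeps the right.
        candidates = candidates[:mid] if gaze == "left" else candidates[mid:]
        if len(candidates) == 1:
            break
    # One phrase left: this is what would be spoken aloud.
    return candidates[0] if len(candidates) == 1 else None

phrases = ["Hello", "Thank you", "Yes", "No",
           "I'm hungry", "Help", "Water", "Goodbye"]
print(select_phrase(phrases, ["left", "right", "left"]))  # → Yes
```

With eight phrases, three gazes are enough to pin down any one of them, which is why the app can feel fast despite offering only two choices at a time.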

The app is available to everyone and compatible with Android 9.0 and later.

RELATED: 3 apps that will make your Android run smoother and faster.

‘Now conversations can happen more easily’

Look to Speak was developed with the help of artist Sarah Ezekiel. Diagnosed with motor neuron disease in 2000, she’s one of millions of people living with speech and motor impairments.

Ezekiel worked with Google speech and language therapist Richard Cave to find ways machine learning could help people living with similar impairments. Cave specializes in working with people who are non-verbal and require assistance to communicate.

“It’s more than a job for me, it’s a passion,” he said on the Google blog. “Every day, I strive to help people find easier and more accessible ways to express their everyday needs, opinions, feelings and identity.”

The hope is to help people with temporary, permanent or situational disabilities — especially in situations where other more cumbersome communication devices can’t go. Think outside, in the shower or in urgent situations, Cave said.

“Now conversations can more easily happen where before there might have been silence, and I’m excited to hear some of them.”

Let LiDAR be your eyes

Advances in tech can be life-changing for people with disabilities. Take Be My Eyes, an app that lets users “lend their eyes” to those with visual impairments. Try it out if you have some free time.

Apple recently launched several apps to help those with visual impairments, like People Detection and an update to its on-screen magnifier.

Making use of the LiDAR sensor on the iPhone 12 Pro models, People Detection gives visually impaired people a way to “see” what’s around them. It can detect objects and other people, which is especially useful in crowded places or on city sidewalks.

The technology can be integrated with AirPods and Apple’s screen-reader technology, VoiceOver.

RELATED: Back Tap is one of the best new iPhone features you’re not using yet.

Google’s Android Accessibility Suite features a large on-screen menu that makes options easier to see and select, the ability to have options read aloud, and a way to control your device with a keyboard instead of the touchscreen.

With the TalkBack screen reader, you can control your device with gestures. There’s even an on-screen braille keyboard.

Tags: Android, Apple, Apple iPhone, communication, disabilities, Google, Google Play Store, machine learning