GazeSpeak: Microsoft App That Lets People With ALS Communicate With Eyes

Image: a man winks at a phone held by another person directly in front of him. The inset at bottom left shows the back of the phone, which bears a sticker laid out like an e-tran board.

For people with motor neuron diseases like ALS, eye movement may become the only way to communicate with others. There are many eye-tracking solutions available for such situations; however, those solutions can be quite expensive, and not many people can afford them. They also need frequent calibration and are not always accurate, especially in sunlight.

Image: an interpreter displaying the e-tran board to a person in a wheelchair.

A cheaper alternative is a physical e-tran communication board (also called an eye gaze board): a transparent board with groups of letters and colors. To communicate words, the person with ALS indicates with their eyes which letter they want, and the interpreter holding the board writes those letters and words down. This method, although low cost, can be quite slow, requires patience, and is not easy to master.

Keeping these two existing solutions in mind, and wanting to make eye-movement communication not only efficient but also more affordable, Microsoft researchers have created a new low-cost app called GazeSpeak. It is an eye-gesture communication app based on the e-tran board concept, but it is more portable and robust, easier to learn, and faster to communicate with.

GazeSpeak uses a sticker on the back of the phone held by the interpreter. This sticker has the same layout as the e-tran board, and the person with ALS moves their eyes in the direction of the group containing the letter they want. The app detects the eye movement and, on the phone's interface, shows the interpreter intelligent suggestions for what the word might be. The interface also lets the interpreter type letters directly. The entire string of communication is displayed in the app; if a letter needs to be corrected, the person blinks one eye to indicate a correction, and blinking the other eye indicates completion of a word.
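Because each gaze direction selects a whole group of letters rather than a single letter, the app has to disambiguate the sequence, much like T9 predictive text. Here is a minimal sketch of that idea in Python. The four-direction grouping, the letter assignments, and the tiny dictionary are all assumptions for illustration; the real GazeSpeak sticker layout and word-prediction model may differ.

```python
# Hypothetical e-tran-style layout: each gaze direction selects a
# group of letters. The actual GazeSpeak grouping may differ.
GROUPS = {
    "up":    set("abcdef"),
    "right": set("ghijkl"),
    "down":  set("mnopqr"),
    "left":  set("stuvwxyz"),
}

# Toy dictionary standing in for the app's word-prediction vocabulary.
DICTIONARY = ["hello", "help", "water", "yes", "no", "thanks"]

def direction_of(letter):
    """Return the gaze direction whose group contains this letter."""
    for direction, letters in GROUPS.items():
        if letter in letters:
            return direction
    raise ValueError(f"no group contains {letter!r}")

def candidates(directions, dictionary=DICTIONARY):
    """T9-style decoding: dictionary words whose letters, group by
    group, match the observed sequence of gaze directions."""
    return [
        word for word in dictionary
        if len(word) == len(directions)
        and all(direction_of(c) == d for c, d in zip(word, directions))
    ]

# "hello" maps to h->right, e->up, l->right, l->right, o->down
print(candidates(["right", "up", "right", "right", "down"]))
```

With a realistic dictionary many direction sequences match several words, which is why the app ranks suggestions and lets the interpreter pick, and why a blink gesture is needed to correct or finalize a word.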

GazeSpeak can also use a phone's front-facing camera, which is helpful for communicating when the phone is mounted on the person's own wheelchair.

This app will be presented in May at the Conference on Human Factors in Computing Systems (CHI) in Denver, CO.

Watch this video to learn how GazeSpeak works:


Source: Microsoft, The Telegraph

E-tran board image source: Bridges-Canada.com

Additional Information:

Watch this video to see how a traditional e-tran board is used for communicating with a person with motor neuron disease.

5 Comments

  1. Melissa Oliver, MS OTR/L February 28, 2017 at 6:16 am

    When will this app be available for purchase? Is it Android devices only?

    • This is still a proof of concept. It’s possible that it will be available to everyone after it is presented at the Conference on Human Factors in Computing Systems in May. Knowing what I know about Microsoft apps, it may be available for all platforms.

  2. I need this for several students

  3. Communication would be quicker if you chose the colour followed by the group containing the letter. This would save the assistant having to look a second time at the original group.

