GazeSpeak: Microsoft App That Lets People With ALS Communicate With Eyes
For people with motor neuron diseases like ALS, eye movement may become the only way to communicate with others. There are many eye tracking solutions available for such situations; however, those solutions can be quite expensive, and not everyone can afford them. They also need frequent calibration and are not always accurate, especially in sunlight.
A cheaper alternative is a physical e-tran communication board (also called an eye gaze board), a transparent board with groups of letters and colors. To communicate words, the person with ALS indicates with their eyes which letter they want, and the interpreter holding the board writes those letters and words down. This method, although low cost, can be quite slow, requires patience, and is not easy to master.
Keeping these two existing solutions in mind, and aiming to make communication through eye movement and gestures not only efficient but also more affordable, Microsoft researchers have created a new low-cost app called GazeSpeak. It is an eye gesture communication tool based on the e-tran board concept, but it is more portable, more robust, easier to learn, and faster to communicate with.

GazeSpeak uses a sticker on the back of the phone, which is held by the interpreter. The sticker has the same layout as an e-tran board, and the person with ALS moves their eyes in the direction of the group containing the letter they want. The app detects the eye movement and, on the phone's interface, offers the interpreter intelligent suggestions for what the word might be. The interface also lets the interpreter type letters directly. The entire string of communication is displayed in the app; if a letter needs to be corrected, the person blinks one eye to indicate a correction, and blinking the other eye indicates the completion of a word.
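Because each eye gesture selects a whole group of letters rather than a single letter, the app has to disambiguate the gesture sequence into likely words, much like T9 predictive text. The sketch below illustrates the idea with a hypothetical four-way grouping of the alphabet and a small sample vocabulary; the real GazeSpeak sticker layout and prediction model are not published in this article, so treat the group boundaries and function names here as assumptions.

```python
# Hypothetical sketch of GazeSpeak-style word disambiguation.
# Assumption: the alphabet is split into four directional groups
# (the actual sticker layout may differ).
GROUPS = {
    "up": set("abcdefg"),
    "right": set("hijklmn"),
    "down": set("opqrstu"),
    "left": set("vwxyz"),
}

def candidates(directions, vocabulary):
    """Return vocabulary words consistent with a gaze-direction sequence.

    Each direction selects a group of letters; a word matches if it has
    the same length and each of its letters falls in the selected group.
    """
    def matches(word):
        return len(word) == len(directions) and all(
            ch in GROUPS[d] for ch, d in zip(word, directions)
        )
    return [w for w in vocabulary if matches(w)]

# Example: "up, up, down" is ambiguous between several words,
# so the interpreter is shown all matching suggestions.
vocab = ["cat", "dog", "can", "bat", "cup"]
print(candidates(["up", "up", "down"], vocab))  # → ['cat', 'bat']
```

In the real app, suggestions would presumably be ranked by word frequency or a language model so the most likely word appears first.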
GazeSpeak can also use the front-facing camera of a phone, which is helpful for communicating when the phone is mounted on the user's wheelchair.
The app will be presented in May at the Conference on Human Factors in Computing Systems (CHI) in Denver, CO.
Watch this video to learn how GazeSpeak works:
Source: Microsoft, The Telegraph
E-tran board image source: Bridges-Canada.com
Additional Information:
Watch this video to see how a traditional e-tran board is used for communicating with a person with motor neuron disease.
When will this app be available for purchase? Is it Android devices only?
This is still a proof of concept. It's possible that it will be available to everyone after the team presents it at the Conference on Human Factors in Computing Systems in May. Knowing what I know about Microsoft apps, it may be released for all platforms.
I need this for several students
Hi Tim! I would say keep an eye on the Conference on Human Factors in Computing Systems, to be held in May. That is where this team is presenting the app, and hopefully they will have some news on when it will be available to the public.
https://chi2017.acm.org/index.html
Communication would be quicker if you choose colour followed by the group containing the letter. This would save the assistant having to look a second time at the original group.
Is this app available yet? I have a friend who really needs it
Could you please contact me about using my copyrighted images? Thanks.
Hello Margaret,
Sent you an email last night.