Tongue Interface Using Kinect

Researchers at The University of Electro-Communications are developing an interface that detects tongue movements and lets users select items with their tongues. The interface is designed for people who have difficulty speaking and swallowing (stroke victims, for example).

The interface uses the Kinect and works in stages: it first detects the locations of the two eyes, then estimates the position of the tip of the nose from them. From the nose position it estimates the mouth region, and within that region it tracks the actual movement of the tongue.
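The coarse-to-fine localization described above can be sketched with simple geometry. This is only an illustration: it assumes eye positions are already available (e.g. from a face tracker), and the scaling ratios are illustrative guesses, not the researchers' actual values.

```python
# Hypothetical sketch: estimate nose and mouth locations from eye positions.
# The 0.6 / 0.3 / 0.5 ratios are illustrative, not the researchers' values.

def estimate_nose_tip(left_eye, right_eye):
    """Place the nose tip below the midpoint between the eyes."""
    mid_x = (left_eye[0] + right_eye[0]) / 2
    mid_y = (left_eye[1] + right_eye[1]) / 2
    eye_dist = abs(right_eye[0] - left_eye[0])
    # Drop down by ~60% of the inter-eye distance (assumed ratio).
    return (mid_x, mid_y + 0.6 * eye_dist)

def estimate_mouth_region(nose_tip, eye_dist):
    """Return a bounding box (x, y, w, h) for the mouth area below the nose."""
    x = nose_tip[0] - 0.5 * eye_dist
    y = nose_tip[1] + 0.3 * eye_dist
    return (x, y, eye_dist, 0.5 * eye_dist)

left, right = (100, 120), (160, 120)
nose = estimate_nose_tip(left, right)
mouth = estimate_mouth_region(nose, abs(right[0] - left[0]))
print(nose)   # (130.0, 156.0)
print(mouth)  # (100.0, 174.0, 60, 30.0)
```

Once the mouth bounding box is known, tongue tracking only has to search inside it, which is what makes the staged approach cheaper and more robust than scanning the whole frame.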

Using the interface effectively requires some tongue training. One exercise is moving the tongue from left to right, and the researchers have actually built a game around this left-to-right movement.
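A toy sketch of how left-to-right tongue movement might drive a game input is shown below. The coordinates and dead-zone threshold are hypothetical; the researchers' actual game logic is not described in the article.

```python
# Hypothetical mapping from a tracked tongue-tip x coordinate (relative to
# the mouth center) to a discrete game command. Values are illustrative.

def tongue_command(tongue_x, mouth_center_x, dead_zone=5):
    """Map horizontal tongue displacement to a discrete game command."""
    offset = tongue_x - mouth_center_x
    if offset < -dead_zone:
        return "left"
    if offset > dead_zone:
        return "right"
    return "center"  # within the dead zone: no movement registered

print(tongue_command(118, 130))  # left
print(tongue_command(140, 130))  # right
```

A dead zone around the mouth center is a common way to avoid jitter from small, unintentional movements in a tracker-driven control scheme.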

The interface is still a work in progress: quite raw and not very robust. The researchers are working on detecting tongue movements more precisely, and they plan to add detection of lip movements alongside tongue movements in the future. Still, this is another fine example of what technology and innovation can do to make lives better.

Watch the video to see this interface (and the game) in action.

Source: DigInfo via Engadget
