Unlike MotionSavvy, which is essentially just a tablet, this prototype consists of a glove fitted with flex sensors and an accelerometer, attached to a "control section" with an LCD screen and audio output. The sensors on the glove detect how the fingers are bent, while the accelerometer detects the tilting and movement of the hand. Once those gestures are captured by the glove, they are sent to the control section, which holds a database of sign language gestures. The captured movements are matched against the stored gestures; the pre-recorded audio equivalent is then spoken, and the text for those signs is displayed on the screen.
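The matching step described above could work in many ways; as a rough illustration, here is a minimal sketch in Python of one plausible approach, nearest-neighbor matching of a sensor reading against stored gesture templates. All names, values, and the distance threshold are hypothetical, not taken from the actual prototype.

```python
import math

# Hypothetical gesture "database": each sign maps to a template vector of
# five flex-sensor readings plus three accelerometer tilt values (x, y, z).
# The numbers are purely illustrative.
GESTURE_DB = {
    "hello":     [0.1, 0.1, 0.1, 0.1, 0.1, 0.0, 0.0, 1.0],
    "thank you": [0.9, 0.8, 0.8, 0.8, 0.9, 0.0, 0.5, 0.5],
    "yes":       [0.9, 0.9, 0.9, 0.9, 0.9, 0.2, 0.0, 0.8],
}

def match_gesture(reading, db=GESTURE_DB, threshold=0.5):
    """Return the stored sign closest to the captured reading,
    or None if nothing falls within the distance threshold."""
    best_sign, best_dist = None, float("inf")
    for sign, template in db.items():
        dist = math.dist(reading, template)  # Euclidean distance
        if dist < best_dist:
            best_sign, best_dist = sign, dist
    return best_sign if best_dist <= threshold else None

# A reading close to the "yes" template should match it.
print(match_gesture([0.85, 0.9, 0.88, 0.92, 0.9, 0.15, 0.05, 0.8]))  # → yes
```

Once a match is found, looking up and playing the corresponding pre-recorded audio clip and displaying the text would be straightforward additions.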
There is no news yet on when this device will hit the market or how much it will cost, but the encouraging part is that students everywhere are taking the initiative to change the way we interact with people with impairments and disabilities! It is nice to see a shift in how we think about bridging all sorts of communication gaps and overcoming limitations!
[Thanks for sharing, Mayank!]