How do blind people know what’s going on around them? Are the people they’re talking to interested in the conversation? Are they paying full attention? What sort of activities are people taking part in nearby while they are, let’s say, sitting on a park bench?
Wouldn’t it be nice if a device could describe the people around them, along with their emotions and behavior?
Saqib Shaikh is a Microsoft developer who lost his vision when he was 7. As a developer who knows what technology, especially newer Microsoft technology, can do, he is working on a project called Seeing AI that can describe the surrounding objects and people to a blind person. The project uses Microsoft Cognitive Services, a set of APIs (tools for building software applications) that allow devices and apps to see, hear, speak, and understand emotions and needs across various platforms.
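To make the API idea concrete, here is a minimal sketch of how an app like Seeing AI might prepare a request to the Computer Vision “describe” endpoint of Microsoft Cognitive Services, which returns a natural-language caption for an image. The endpoint path and the `Ocp-Apim-Subscription-Key` header reflect the public REST API; the region, API version, and key shown here are placeholder assumptions, not values from the article.

```python
# Sketch: building an image-description request for Microsoft Cognitive
# Services (Computer Vision). Region, version, and key are assumptions.
def build_describe_request(region: str, subscription_key: str, max_candidates: int = 1):
    """Return the URL, headers, and query params for a 'describe image' call."""
    # The "describe" operation returns a human-readable caption for an image.
    url = f"https://{region}.api.cognitive.microsoft.com/vision/v3.2/describe"
    headers = {
        # API key header used by Cognitive Services for authentication
        "Ocp-Apim-Subscription-Key": subscription_key,
        # The request body would carry the raw image bytes
        "Content-Type": "application/octet-stream",
    }
    params = {"maxCandidates": max_candidates}  # how many caption candidates to return
    return url, headers, params

url, headers, params = build_describe_request("westus", "YOUR_KEY")
print(url)
```

An app would then POST the image bytes to that URL (for example with the `requests` library) and read the caption out of the JSON response, turning a camera frame into a spoken description.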
In the video, Saqib says that artificial intelligence is improving at a very fast rate, and that what we see in Seeing AI is just the beginning. It’s exciting to see what the future has in store for us!
Watch the video to see how Seeing AI works and everything it can do.
[Thank you for sharing, Prady!]