Four Reasons Augmented Reality Will Change Assistive Technology

[Image: augmented reality furniture displayed on various electronic devices]

Augmented reality for navigation is a hot topic in assistive technology.

 In our analysis, we found that more than 600 articles on AR and assistive technology have been published since 2000, many within the past 5 years.

 Why is that? What’s the big deal with AR, anyway?

In this article, we’ll take a look at what augmented reality is and how we believe it will change assistive technologies.

What Is Augmented Reality?

Augmented reality is an extension of computer vision, the field concerned with enabling computers to extract meaning from images.

 AR is typically thought of as computer-generated assets placed over the real world using some type of display, including wearables.

However, augmented reality more broadly takes whatever useful visual perception a person already has and supplements it with additional computer-generated sensory information to help them better understand the world.

 When we think of how to use augmented reality with assistive technologies, we usually think of navigation.

Some promising computer vision techniques that can support navigation include:

 1.     Tracking and probabilistic inference of position

2.     Object and face detection algorithms

3.     Depth calculations using a variety of techniques, including stereo cameras and infrared beams (a brief sketch of this follows the list)

4.     Optical flow calculations, including time to collision, motion detection, focus of expansion, and inertial information

5.     Use of context (sense of place from either the user, a previous user, or a knowledge map)
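To make one of these techniques concrete, here is a minimal sketch of depth estimation from a stereo camera pair using OpenCV’s block-matching algorithm. The image files, focal length, and camera baseline are hypothetical placeholders; a real assistive app would calibrate and rectify its cameras and smooth the resulting depth map.

```python
# Minimal sketch: depth from a stereo camera pair using OpenCV block matching.
# File names, focal length, and baseline are hypothetical placeholders.
import cv2
import numpy as np

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)    # left camera frame
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)  # right camera frame

# Block matcher; the disparity search range must be a multiple of 16.
stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = stereo.compute(left, right).astype(np.float32) / 16.0  # fixed-point -> pixels

# Depth (m) = focal_length_px * baseline_m / disparity_px, for disparity > 0.
focal_length_px = 700.0  # assumed focal length in pixels
baseline_m = 0.06        # assumed 6 cm between the two cameras
valid = disparity > 0
depth_m = np.zeros_like(disparity)
depth_m[valid] = focal_length_px * baseline_m / disparity[valid]

if valid.any():
    print("Median depth of valid pixels: %.2f m" % np.median(depth_m[valid]))
```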

 Now that we know a bit more about augmented reality, how can it be used for assistive technologies?

 1. AR Will Make Assistive Technology More Cost-Efficient

Cost is a considerable factor in almost any purchase, whether it’s a gallon of milk, a car, or a new TV.

 Cost is especially important for assistive technology, as the World Health Organization reports that “in many low-income and middle-income countries, only 5-15% of people who require assistive devices and technologies have access to them.”

 However, smartphone ownership is rising among these populations, according to Pew.

With smartphone ownership growing, it may make sense for more assistive technologies to be delivered through smartphones, tablets, and other smart devices.

In our interviews with people with low to no vision, cost was a prohibitive factor in purchasing assistive technology, even with funding assistance. One interviewee said he liked the idea of an app running on a smartphone because it would cost a fraction of what a dedicated piece of assistive hardware does.

 The biggest benefit of using AR on smartphones for assistive technology is that AR is just scratching the surface of what those devices are capable of.

2. AR Is One Of Many Smart Device Functions

Unlike most dedicated assistive technology devices, smartphones and tablets serve more than one function. Smartphones in particular let users place and receive phone calls, send text messages, play games, listen to music, participate in social media, and much more.

With a single smart device handling multiple functions, the need to carry many individual pieces of assistive technology begins to fade away.

 3. AR Spans The Spectrum Of Visual Impairments

Augmented reality can assist people across the spectrum of visual impairment, not just those with low vision or those with no vision.

 First, people who have low vision can use an app, such as Pebble HD, or a product, such as the SmartLux Digital, to help them decipher an image or text.

Second, AR technologies that feed additional information to a person as they navigate the world include augmented reality glasses, object recognition, and automatic mapping engines. These can be combined to produce new apps that help people with visual impairments navigate their environment, both indoors and outdoors.
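As an illustration of the object recognition piece, below is a minimal sketch that detects faces in a single camera frame using OpenCV’s bundled Haar cascade. The input image name is a placeholder, and a real navigation aid would run detection on live video and announce results by speech or tone rather than printing them.

```python
# Minimal sketch: face detection on a single frame with OpenCV's bundled
# Haar cascade. A real navigation aid would run this on live video and
# announce detections by speech or tone; here we simply print them.
import cv2

cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
detector = cv2.CascadeClassifier(cascade_path)

frame = cv2.imread("street_scene.jpg")          # hypothetical input image
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)  # cascades operate on grayscale

faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
for (x, y, w, h) in faces:
    # Larger bounding boxes are generally closer; an app could map size to urgency.
    print(f"Face at x={x}, y={y}, approx. size {w}x{h} px")
```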

A third approach to augmenting information for people with visual impairments is “sensory substitution” or “sensory addition.” Here, researchers use alternative sensory channels to feed information to a person with a disability, augmenting their normal everyday experiences. Examples of projects using alternative sensory channels include the following (a toy sketch of the idea appears after the list):

●      Use of the tongue to “see” – Brainport

●      Use of hearing music to “see” – EyeMusic

●      Use of bone conduction to “see” – OrCam

●      Use of skin on the back to “see” – V.E.S.T. (Versatile Extra-Sensory Transducer)
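None of these projects expose a simple public API, but the core idea of sensory substitution can be sketched in a few lines: scan an image column by column and turn bright pixels into tones, mapping vertical position to pitch and brightness to loudness. This is a toy illustration of the concept, not the actual encoding used by EyeMusic or the other systems above.

```python
# Toy sketch of image-to-sound sensory substitution: scan a grayscale image
# left to right and convert each column into a short chord, with higher rows
# mapped to higher pitches and pixel brightness mapped to loudness.
# This illustrates the concept only; it is not the EyeMusic encoding.
import numpy as np

SAMPLE_RATE = 22050
COLUMN_SECONDS = 0.05  # playback time per image column

def column_to_audio(column, freq_low=200.0, freq_high=2000.0):
    """Turn one image column (values 0-255, top row first) into audio samples."""
    t = np.linspace(0, COLUMN_SECONDS, int(SAMPLE_RATE * COLUMN_SECONDS), endpoint=False)
    n_rows = len(column)
    freqs = np.linspace(freq_high, freq_low, n_rows)  # top of image -> high pitch
    chord = np.zeros_like(t)
    for brightness, freq in zip(column, freqs):
        chord += (brightness / 255.0) * np.sin(2 * np.pi * freq * t)
    return chord / max(n_rows, 1)  # keep amplitude in a sane range

def image_to_soundscape(image):
    """Concatenate the audio for every column, scanning left to right."""
    return np.concatenate([column_to_audio(image[:, col]) for col in range(image.shape[1])])

# Hypothetical 8x8 "image": a bright diagonal line produces a rising sweep.
demo = np.zeros((8, 8), dtype=np.uint8)
np.fill_diagonal(np.fliplr(demo), 255)
audio = image_to_soundscape(demo)
print(f"Generated {audio.size / SAMPLE_RATE:.2f} s of audio")
```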

4. AR Is Useful Where Canes, Guide Dogs, And Human Guides Fall Short

 Canes and guide dogs are immensely useful for people with visual impairments, according to much research, as well as our own surveys. 

 They fall short in crucial areas, though.

 For example, canes can’t detect overhead hazards. According to a 2010 survey of 307 persons with a visual impairment, approximately 13% experience a head-level accident more frequently than once a month. And, 23% of those head-level accidents required some level of medical intervention.

 In addition to head injuries, a review of 31 studies found that “those with reduced visual acuity are 1.7 times more likely to have a fall and 1.9 times more likely to have multiple falls compared with fully sighted populations. The odds of a hip fracture are between 1.3 and 1.9 times greater for those with reduced visual acuity.”

 What’s Next for Augmented Reality And Assistive Technology?

 A report on the growing market for assistive technologies suggests augmented reality could play a key role.

 “Steady advances in materials and technologies associated with communications, electronics, materials, smart materials, mobile location, and sensors will stimulate robust market growth,” according to BCC Research.

 Part of that growth could be from Tango.

Tango is a platform developed by Google that combines a unique set of sensors and software to give users several features available only on Tango devices:

1.     Motion-tracking: Tango uses visual features of the environment, in combination with accelerometer and gyroscope data, to closely track the device’s movements in space (a simplified sketch of the idea follows this list).

2.     Area learning: Tango stores environment data in a map that can be used later, shared with other Tango devices, and enhanced with metadata such as notes, instructions, or points of interest.

3.     Depth perception: Tango detects distances, sizes, and surfaces in the environment.
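Tango’s SDK isn’t shown here, but the motion-tracking idea can be illustrated with a simplified 2D dead-reckoning loop: integrate a gyroscope yaw rate and a forward-speed estimate (which Tango derives from visual features) to update the device’s pose over time. This is a toy model of the concept, not Google’s actual visual-inertial fusion.

```python
# Toy 2D dead-reckoning sketch of the motion-tracking idea: integrate a
# gyroscope yaw rate and a forward-speed estimate (which Tango derives from
# visual features) to track the device's pose over time. This illustrates the
# concept only; it is not Google's actual visual-inertial fusion.
import math

def integrate_pose(samples, dt=0.01):
    """samples: iterable of (yaw_rate_rad_per_s, forward_speed_m_per_s) readings."""
    x, y, heading = 0.0, 0.0, 0.0
    poses = []
    for yaw_rate, speed in samples:
        heading += yaw_rate * dt              # integrate the gyroscope yaw rate
        x += speed * math.cos(heading) * dt   # advance along the current heading
        y += speed * math.sin(heading) * dt
        poses.append((x, y, heading))
    return poses

# Hypothetical readings: walk straight for 2 s, then turn gently while walking.
samples = [(0.0, 1.2)] * 200 + [(0.5, 1.2)] * 200
x, y, heading = integrate_pose(samples)[-1]
print(f"Final pose: x={x:.2f} m, y={y:.2f} m, heading={math.degrees(heading):.1f} deg")
```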

The Lenovo Phab 2 Pro is currently the only available device that runs Tango, with offerings on the roadmaps of both Huawei and ASUS.

Want To Learn More?

One of the offerings we have at Float is Cydalion, a navigation app for people with visual impairments. It uses the augmented reality capabilities outlined here and is available now for Tango devices.

 About The Author

 Adam Bockler is the communications manager at Float, which designs and builds powerful and useful digital experiences for the enterprise. This article is a summary of many blog articles written by Float Senior Analyst Dr. Gary Woodill.

Image Source: Marketing Discussions
