There are many things an able-bodied person can do very easily on their phone. Want to go somewhere? Open the Lyft app and get a ride. Cooking dinner? Look up a recipe online. However, for the 630 million people in the world who live with cognitive disabilities, these seemingly “simple” tasks are not that simple.
To make such tasks easier for people with cognitive disabilities, Google is working on an experimental feature that combines Android phones with Google Assistant (Google’s answer to Amazon’s Alexa). “Action Blocks” let a caregiver add a Google Assistant command to the home screen of a user’s phone and associate an image with it. The image acts as a visual cue, so the user understands what action will be performed when they press it.
As you can see in the image below, a caregiver can add a command titled “Bedtime story” that asks Google Assistant to “Tell me a bedtime story”; Assistant then starts a story for the user. This “Action Block” can then be associated with an image and placed on the home screen. Any time the user wants to hear a bedtime story, they can press that image. Action Blocks can be created for practically any request Google Assistant can respond to, such as “What’s the weather?”, “Turn on the TV”, or “Turn off all lights”. Each of these actions can be given an image of the sun, a TV, or a light bulb, respectively, to provide a visual cue and make it easy for the person to use them.
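Conceptually, this amounts to pairing a home-screen icon with an Assistant command. Google has not published how Action Blocks is implemented, but a minimal Kotlin sketch of the idea, using Android’s standard pinned-shortcut API to place an icon that launches the device’s voice assistant, might look like the following (the label, icon resource, and the simplification of merely opening the assistant rather than passing a specific phrase are all assumptions):

```kotlin
import android.content.Context
import android.content.Intent
import androidx.core.content.pm.ShortcutInfoCompat
import androidx.core.content.pm.ShortcutManagerCompat
import androidx.core.graphics.drawable.IconCompat

// Hypothetical sketch: pin a home-screen shortcut whose icon serves as the
// visual cue and whose tap launches the voice assistant. How Action Blocks
// actually forwards a specific command to Google Assistant is not public;
// this shortcut simply opens the assistant.
fun pinActionBlock(context: Context, label: String, iconRes: Int) {
    if (!ShortcutManagerCompat.isRequestPinShortcutSupported(context)) return

    // Launch the default voice assistant when the shortcut is tapped.
    val assistIntent = Intent(Intent.ACTION_VOICE_COMMAND).apply {
        flags = Intent.FLAG_ACTIVITY_NEW_TASK
    }

    val shortcut = ShortcutInfoCompat.Builder(context, "bedtime_story")
        .setShortLabel(label)                               // e.g. "Bedtime story"
        .setIcon(IconCompat.createWithResource(context, iconRes))
        .setIntent(assistIntent)
        .build()

    // Ask the launcher to place the shortcut on the home screen.
    ShortcutManagerCompat.requestPinShortcut(context, shortcut, null)
}
```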
Currently, Action Blocks is still in the testing phase. If you are a caregiver or family member of a person with cognitive disabilities who you think could benefit from Action Blocks, you can join Google’s trusted tester program.
Source: Google