Apple Announces New Accessibility Features: Eye Tracking, Music Haptics, and More

The image shows the CarPlay interface, with app icons such as Phone, Music, Maps, Messages, and Now Playing visible on the screen. A notification banner appears at the bottom of the screen.

Major Updates

  • Eye Tracking: Control iPad or iPhone with your eyes.
  • Music Haptics: Experience music through vibrations.
  • Vocal Shortcuts: Perform tasks with custom sounds.
  • Vehicle Motion Cues: Reduce motion sickness in vehicles.
  • CarPlay and visionOS: Expanded accessibility features across both platforms.

Apple has announced new accessibility features, coming later this year, designed to improve the experience for users with a range of disabilities. They include Eye Tracking, which lets users control iPad and iPhone with their eyes, making navigation easier for people with physical disabilities; the feature uses the front-facing camera and on-device machine learning, so data stays private and secure. For music lovers who are deaf or hard of hearing, Music Haptics will let them experience music through the iPhone’s Taptic Engine, which provides tactile feedback synchronized with the audio.

Visual below shows a person using Eye Tracking on her iPad.

Visual below shows animated concentric rectangles looping to indicate tactile feedback synced to the music.
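
To give a concrete sense of the kind of tactile feedback the Taptic Engine can produce, here is a minimal Swift sketch using Apple’s Core Haptics framework to play a short pattern. It is only an illustration: the events, intensities, and timings are hypothetical, and this is not how Apple’s system-wide Music Haptics feature is implemented.

```swift
import CoreHaptics

// Hypothetical sketch: playing a short haptic pattern with Core Haptics.
// The events, intensities, and timings are illustrative only; this is not
// Apple's Music Haptics implementation.
final class PulsePlayer {
    private var engine: CHHapticEngine?

    func playPulse() throws {
        // Bail out on hardware without haptics support (e.g. most iPads).
        guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }

        // Keep the engine alive for as long as the pattern plays.
        let engine = try CHHapticEngine()
        self.engine = engine
        try engine.start()

        // A sharp transient "tap" followed by a softer continuous rumble.
        let tap = CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.8)
            ],
            relativeTime: 0
        )
        let rumble = CHHapticEvent(
            eventType: .hapticContinuous,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.5),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.2)
            ],
            relativeTime: 0.1,
            duration: 0.4
        )

        let pattern = try CHHapticPattern(events: [tap, rumble], parameters: [])
        let player = try engine.makePlayer(with: pattern)
        try player.start(atTime: CHHapticTimeImmediate)
    }
}
```

Apple describes Music Haptics as synchronizing this kind of feedback with the audio itself; the sketch above only shows the building blocks Core Haptics already exposes to apps.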

Another significant update is Vocal Shortcuts, which lets users assign custom sounds to Siri commands so they can complete tasks with a vocalization of their choosing. Alongside it, Listen for Atypical Speech improves speech recognition for people with conditions such as cerebral palsy or ALS, extending device control to a broader range of speech patterns. Apple is also introducing Vehicle Motion Cues, which can reduce motion sickness for passengers using an iPhone or iPad in a moving vehicle by displaying animated dots that align with the vehicle’s motion.

Visual below shows animated dots moving to the left or right of the Apple device’s screen as the vehicle makes a left or right turn.
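
For readers curious about the motion data such cues could respond to, here is a hedged Swift sketch that samples lateral acceleration with Apple’s Core Motion framework. The update rate and threshold are arbitrary assumptions, and this is not Apple’s Vehicle Motion Cues implementation.

```swift
import CoreMotion

// Hypothetical sketch: sampling the lateral (side-to-side) acceleration that
// on-screen motion cues could mirror. Not Apple's Vehicle Motion Cues code.
let motionManager = CMMotionManager()

func startMotionUpdates() {
    guard motionManager.isDeviceMotionAvailable else { return }

    // Assumed sample rate of 30 Hz; a real implementation would tune this.
    motionManager.deviceMotionUpdateInterval = 1.0 / 30.0

    motionManager.startDeviceMotionUpdates(to: .main) { motion, error in
        guard let motion = motion, error == nil else { return }

        // userAcceleration excludes gravity; x is roughly the device's left-right axis.
        let lateral = motion.userAcceleration.x

        // Arbitrary threshold: flag sideways acceleration above 0.1 g as a turn.
        if abs(lateral) > 0.1 {
            print("Lateral acceleration of \(lateral) g; a cue animation could shift its dots here")
        }
    }
}
```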

CarPlay and visionOS will also receive accessibility updates. CarPlay will gain Voice Control, Color Filters, and Sound Recognition, aiding users who are deaf or hard of hearing and those with color blindness. visionOS will add Live Captions for spoken dialogue in live conversations and in audio from apps, along with support for Made for iPhone hearing devices.

These new accessibility features underscore Apple’s dedication to inclusive design and its commitment to using advanced technology to enhance the lives of all users. Announced in conjunction with Global Accessibility Awareness Day, the updates highlight Apple’s ongoing effort to make accessibility a priority. By introducing features like Eye Tracking, Music Haptics, and Vocal Shortcuts, Apple continues to build products that serve the diverse needs of its global user base, enriching the user experience and setting a benchmark for the industry: technology designed to be truly for everyone.

Source: Apple

ChatGPT, itself a potential tool for increased accessibility, was used as a research and writing aid for this blog post. Do you think this is an appropriate use of ChatGPT? Why or why not? Let me know!
