New Apple Accessibility Features: Control iPhone and iPad with Your Eyes


Apple has announced new accessibility features for iPad, iPhone, and visionOS, set to launch later this year.

The updates include Eye Tracking, Music Haptics, Vocal Shortcuts, Vehicle Motion Cues, and more. These features aim to improve the experience for users with disabilities, drawing on Apple hardware, artificial intelligence, and machine learning.

“We believe deeply in the transformative power of innovation to enrich lives,” said Apple CEO Tim Cook in a statement on Wednesday. “That’s why for nearly 40 years, Apple has championed inclusive design by embedding accessibility at the core of our hardware and software. We’re continuously pushing the boundaries of technology, and these new features reflect our long-standing commitment to delivering the best possible experience to all of our users.”

Apple’s press release was jam-packed with updates on these upcoming accessibility features, with some of the more interesting ones allowing users to control iPhone and iPad with their eyes.

Check out an overview of the upcoming accessibility features below:

Eye Tracking for iPad and iPhone

Eye Tracking enables users with physical disabilities to control their devices using their eyes. This feature utilizes the front-facing camera and on-device machine learning to navigate iPadOS and iOS apps without additional hardware.

Music Haptics

Music Haptics allows users who are deaf or hard of hearing to experience music through vibrations, using the Taptic Engine in iPhone. This feature works with Apple Music and is available as an API for developers to integrate into their apps.

Vocal Shortcuts

Vocal Shortcuts let users assign custom utterances that Siri can recognize to launch shortcuts and complete tasks. Alongside this, enhanced speech recognition for atypical speech, powered by on-device machine learning, benefits users with conditions that affect speech, such as ALS or cerebral palsy.

Vehicle Motion Cues

Vehicle Motion Cues aim to reduce motion sickness for passengers using an iPhone or iPad in a moving vehicle. Animated dots on the edges of the screen represent the vehicle's motion, reducing the sensory conflict that causes motion sickness, and the feature can be toggled on or off in Control Center.

visionOS Accessibility Features for Vision Pro

Updates to visionOS include systemwide Live Captions, including for FaceTime, along with other vision accessibility features. The updates also add support for Made for iPhone hearing devices and cochlear implants.

Additional Updates

  • VoiceOver: New voices, custom volume control, and keyboard shortcuts on Mac.
  • Magnifier: New Reader Mode and easy launch for Detection Mode.
  • Braille: Enhancements for Braille Screen Input, including Japanese language availability and support for multi-line braille.
  • Hover Typing: Shows larger text in preferred font and color.
  • Personal Voice: Available in Mandarin Chinese, with new creation methods for users with speech difficulties.
  • Live Speech: Categories and simultaneous compatibility with Live Captions.
  • Virtual Trackpad for AssistiveTouch: Resizable trackpad for better control.
  • Switch Control: Recognizes finger-tap gestures using device cameras.
  • Voice Control: Support for custom vocabularies and complex words.

Apple’s accessibility updates are great, but they pale in comparison to the AI updates shared this week by OpenAI for ChatGPT and yesterday’s Google Gemini AI updates. For those with vision impairments, generative AI from both OpenAI and Google will be able to describe the world around you using your device’s camera.
