This week, Apple introduced several new accessibility features set to launch later this year. Among these, Eye Tracking stands out, offering users with physical disabilities a way to navigate iPads and iPhones using only their eyes. The feature uses the front-facing camera to set up and calibrate in seconds, and because it relies on on-device machine learning, all data used for setup and control stays on the device and is not shared with Apple.
Additionally, Apple is introducing Music Haptics, which allows deaf or hard-of-hearing users to experience music through the Taptic Engine in iPhones.
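Apple has not said how Music Haptics translates audio into vibration, but the Taptic Engine it relies on is already exposed to developers through the Core Haptics framework. As a rough, hypothetical sketch of the kind of primitive involved, the Swift snippet below plays a short run of transient taps of rising intensity; it is an illustration, not Apple's implementation.

```swift
import CoreHaptics

// Minimal Core Haptics sketch: play a few transient "taps" of rising
// intensity on the Taptic Engine. Illustrative only; Music Haptics'
// actual audio-to-haptics mapping is not public.
func playTapPattern() throws {
    // Not all devices support haptics (for example, most iPads).
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }

    let engine = try CHHapticEngine()
    try engine.start()

    // Three taps, 0.2 s apart, with increasing intensity.
    let events = (0..<3).map { i in
        CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.4 + 0.3 * Float(i)),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.5)
            ],
            relativeTime: 0.2 * Double(i)
        )
    }

    let pattern = try CHHapticPattern(events: events, parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}
```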
Vocal Shortcuts will let users assign custom utterances that launch shortcuts and perform tasks, while the Vehicle Motion Cues feature aims to reduce motion sickness for iPhone and iPad users riding in moving vehicles. visionOS will also receive new accessibility features.
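Apple has said Vehicle Motion Cues works by showing animated dots at the edges of the screen that track changes in vehicle motion, easing the sensory conflict that contributes to motion sickness. As a hedged illustration of the sensing side only, the sketch below uses the Core Motion framework to read device acceleration and map it to a horizontal offset that such on-screen cues could follow; the 60 Hz rate, the clamping, and the 40-point scale are placeholder assumptions, not Apple's algorithm.

```swift
import CoreMotion

// Hedged sketch: use Core Motion to derive a horizontal offset that
// on-screen motion cues (such as animated dots) could follow. The
// mapping below is illustrative; Apple's actual approach is not public.
final class MotionCueSource {
    private let motionManager = CMMotionManager()

    /// Starts streaming device motion and reports a cue offset in points.
    /// `onOffset` is a hypothetical callback a view layer would observe.
    func start(onOffset: @escaping (Double) -> Void) {
        guard motionManager.isDeviceMotionAvailable else { return }
        motionManager.deviceMotionUpdateInterval = 1.0 / 60.0 // 60 Hz
        motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
            guard let motion = motion else { return }
            // userAcceleration excludes gravity; x is lateral in portrait.
            let lateralG = motion.userAcceleration.x
            // Assumed scale: clamp to ±1 g, map to ±40 points of travel.
            let offset = max(-1.0, min(1.0, lateralG)) * 40.0
            onOffset(offset)
        }
    }

    func stop() {
        motionManager.stopDeviceMotionUpdates()
    }
}
```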
These features combine the power of Apple hardware and software, drawing on Apple silicon, artificial intelligence, and machine learning to further Apple's longstanding commitment to inclusive design.
“We believe deeply in the transformative power of innovation to enrich lives,” said Tim Cook, Apple’s CEO. “For nearly 40 years, Apple has championed inclusive design by embedding accessibility at the core of our hardware and software. We’re continuously pushing the boundaries of technology, and these new features reflect our long-standing commitment to delivering the best possible experience to all of our users.”
“Each year, we break new ground when it comes to accessibility,” said Sarah Herrlinger, Apple’s senior director of Global Accessibility Policy and Initiatives. “These new features will make an impact in the lives of a wide range of users, providing new ways to communicate, control their devices, and move through the world.”
Eye Tracking, powered by artificial intelligence, gives users with physical disabilities a built-in way to control their devices with their eyes. The feature works across iPadOS and iOS apps without requiring additional hardware or accessories. Users can navigate the elements of an app and use Dwell Control to activate each one, accessing functions such as physical buttons, swipes, and other gestures solely with their eyes.
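Apple does not document the internals of Dwell Control, but the underlying idea, triggering an element once gaze lingers on it, is simple to state. The hypothetical Swift sketch below fires a callback when a stream of gaze points stays within a small radius of a target for a set time; the DwellDetector type, the one-second threshold, and the 30-point radius are all illustrative assumptions, not Apple's implementation.

```swift
import Foundation
import CoreGraphics

// Hypothetical dwell detector: fires `onActivate` once gaze samples stay
// within `radius` of a target for `dwellTime` seconds. Illustrative of
// the dwell-to-activate idea only.
final class DwellDetector {
    private let dwellTime: TimeInterval
    private let radius: CGFloat
    private var anchor: CGPoint?
    private var anchorTimestamp: TimeInterval?
    private var fired = false

    init(dwellTime: TimeInterval = 1.0, radius: CGFloat = 30) {
        self.dwellTime = dwellTime
        self.radius = radius
    }

    /// Feed each new gaze sample; calls `onActivate` after a completed dwell.
    func ingest(gaze: CGPoint, at time: TimeInterval, onActivate: () -> Void) {
        if let anchor = anchor, hypot(gaze.x - anchor.x, gaze.y - anchor.y) <= radius {
            // Gaze is still on the same target: check elapsed dwell time.
            if !fired, let start = anchorTimestamp, time - start >= dwellTime {
                fired = true
                onActivate()
            }
        } else {
            // Gaze moved to a new target: restart the dwell timer.
            anchor = gaze
            anchorTimestamp = time
            fired = false
        }
    }
}
```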