New Delhi: Apple today revealed a range of new accessibility features set to roll out later this year. These include Eye Tracking, Music Haptics, Vocal Shortcuts and Vehicle Motion Cues. More accessibility upgrades are planned for visionOS.
What is Apple’s new Eye Tracking feature?
Eye Tracking is powered by artificial intelligence and provides a built-in way for users to control iPads and iPhones using just their eyes. This feature is specifically designed to help users with physical disabilities, allowing them to interact with their devices more easily.
How it works
- The front-facing camera sets up and calibrates Eye Tracking in just a few seconds.
- Your privacy is protected: all the data used for Eye Tracking stays on your device and is not shared with Apple.
- Eye Tracking works across apps in iPadOS and iOS, and it requires no additional hardware or accessories.
Using Eye Tracking, users can:
- Navigate through the different parts of an app by looking at them.
- Use Dwell Control to activate elements, selecting a part of the screen by holding their gaze on it for a short period (a simplified sketch of this dwell logic follows the list).
- Perform actions such as pressing buttons, swiping, and other gestures using only their eyes, with no physical touch required.
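The Dwell Control mechanism described above is essentially a fixation timer: a selection fires once the gaze has stayed within a small tolerance radius for a set duration. Apple's actual Eye Tracking is a closed, system-level feature with no public gaze API, so the sketch below only illustrates that general dwell logic in Swift; the GazePoint type, the thresholds, and the simulated samples are all assumptions, not Apple's implementation.

```swift
import Foundation

// Illustrative sketch of dwell-selection logic. The gaze samples, radius,
// and dwell duration are assumptions for demonstration only; they do not
// reflect Apple's Eye Tracking implementation.
struct GazePoint {
    let x: Double
    let y: Double
    let timestamp: TimeInterval
}

final class DwellDetector {
    private let dwellDuration: TimeInterval   // how long the gaze must hold still
    private let toleranceRadius: Double       // allowed jitter, in points
    private var anchor: GazePoint?            // where the current dwell began

    init(dwellDuration: TimeInterval = 1.0, toleranceRadius: Double = 30) {
        self.dwellDuration = dwellDuration
        self.toleranceRadius = toleranceRadius
    }

    /// Feed each new gaze sample; returns the dwell point once the gaze
    /// has stayed within the tolerance radius for the full dwell duration.
    func process(_ point: GazePoint) -> GazePoint? {
        guard let start = anchor else {
            anchor = point
            return nil
        }
        let distance = ((point.x - start.x) * (point.x - start.x)
                      + (point.y - start.y) * (point.y - start.y)).squareRoot()
        if distance > toleranceRadius {
            // Gaze moved away: restart the dwell from the new position.
            anchor = point
            return nil
        }
        if point.timestamp - start.timestamp >= dwellDuration {
            anchor = nil          // reset so the next dwell can begin
            return start          // dwell complete: "select" this point
        }
        return nil
    }
}

// Simulated gaze samples hovering near (100, 200) for just over a second.
let detector = DwellDetector()
for i in 0...12 {
    let sample = GazePoint(x: 100 + Double(i % 3), y: 200, timestamp: Double(i) * 0.1)
    if let hit = detector.process(sample) {
        print("Dwell selection at (\(hit.x), \(hit.y))")  // fires once ~1.0s elapses
    }
}
```

Tuning the two thresholds is the core design trade-off in any dwell system: a shorter dwell duration makes selection faster but increases accidental activations, while a larger tolerance radius accommodates natural eye jitter at the cost of precision.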
Apple CEO Tim Cook said the company believes that innovation enriches lives, and “that’s why for nearly 40 years, Apple has championed inclusive design by embedding accessibility at the core of our hardware and software.” He added that Apple is “continuously pushing the boundaries of technology” and that the new features reflect the company’s “long-standing commitment to delivering the best possible experience to all of our users.”
Sarah Herrlinger, Apple’s senior director of Global Accessibility Policy and Initiatives, said these new features “will make an impact in the lives of a wide range of users, providing new ways to communicate, control their devices, and move through the world.”