Igeekphone, June 11 news: Apple has officially introduced Eye Tracking in its newly released iOS 18 operating system.
The feature uses the built-in front-facing camera to capture the user's eye movements and calibrates with on-device machine learning, allowing users with disabilities to control iPhones and iPads using only their eyes.
Eye Tracking is designed to make the experience more accessible for people with disabilities: users can set it up and calibrate it in seconds, navigate app elements with their eyes, activate them with Dwell Control, and even access physical buttons, swipes, and other gestures.
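The dwell mechanism described above can be illustrated with a small sketch: an element is activated once the gaze has rested on it continuously for a threshold duration. All names here are hypothetical illustrations of the general technique, not Apple's API.

```python
DWELL_THRESHOLD = 1.0  # seconds the gaze must rest on one element (illustrative value)

def detect_dwell(gaze_samples, threshold=DWELL_THRESHOLD):
    """gaze_samples: time-ordered list of (timestamp, element_id) pairs.
    Returns the first element dwelled on for `threshold` seconds, or None."""
    start_time = None
    current = None
    for t, element in gaze_samples:
        if element != current:
            current, start_time = element, t  # gaze moved: restart the timer
        elif element is not None and t - start_time >= threshold:
            return element  # gaze rested long enough: activate
    return None

# Example: the gaze flicks past "back", then rests on "play" for 1.2 seconds.
samples = [(0.0, "back"), (0.3, "play"), (0.6, "play"),
           (0.9, "play"), (1.2, "play"), (1.5, "play")]
print(detect_dwell(samples))  # "play"
```

A real implementation would also need to tolerate brief gaze flickers (debouncing) rather than resetting the timer on every stray sample.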
It is worth noting that all setup and control data are stored securely on the device and are never shared with Apple, fully protecting user privacy.
Eye Tracking in iOS 18 also requires no additional hardware or accessories and works smoothly across iPadOS and iOS apps.
In addition to Eye Tracking, Apple has introduced accessibility features such as Music Haptics and Vocal Shortcuts to further enhance support for users with disabilities.
Music Haptics gives users who are deaf or hard of hearing a new way to experience music through the iPhone's Taptic Engine, while Vocal Shortcuts let users perform tasks with custom sounds.
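The core idea behind translating music into vibration can be sketched simply: take the amplitude envelope of the audio signal and map it to vibration intensities that a haptic actuator could play back. This is a conceptual illustration under assumed window and scaling parameters, not Apple's implementation.

```python
def amplitude_envelope(samples, window=4):
    """Peak absolute amplitude per fixed-size window of audio samples."""
    return [max(abs(s) for s in samples[i:i + window])
            for i in range(0, len(samples), window)]

def to_haptic_intensity(envelope, max_amp=1.0):
    """Scale each envelope value to a 0-255 vibration intensity."""
    return [round(min(a / max_amp, 1.0) * 255) for a in envelope]

# A quiet passage followed by a loud beat becomes a weak tap, then a strong one.
audio = [0.05, -0.1, 0.08, -0.06, 0.9, -0.95, 0.8, -0.85]
print(to_haptic_intensity(amplitude_envelope(audio)))  # [26, 242]
```

On an actual device this mapping would be driven by the audio pipeline in real time and rendered through the haptics hardware; the windowed envelope above just shows how loudness can become tap strength.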