Apple users will soon be able to control their devices with eye movements, following Apple’s announcement that eye tracking is coming to the iPhone and iPad as part of iOS 18.
According to Apple, the feature is among a set of accessibility tools that use artificial intelligence to help people with physical disabilities navigate their devices more easily.
The new feature is joined by a Music Haptics tool, which uses the haptic engine in the iPhone – the component that powers the device’s vibrations – to let users who are deaf or hard of hearing experience music as vibrations synchronised to the audio, The Verge reports.
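Apple has not published how Music Haptics is implemented, but the same haptic engine is exposed to developers through the Core Haptics framework. Purely as an illustration of the underlying mechanism, the sketch below plays a short vibration whose strength follows an audio loudness value; the `loudness` input is a hypothetical stand-in for whatever audio analysis a real app would perform.

```swift
import CoreHaptics

// Illustrative sketch only: drive the iPhone's haptic engine from an
// audio loudness value (0...1). Not Apple's Music Haptics implementation.
final class AudioHapticsSketch {
    private var engine: CHHapticEngine?

    func start() throws {
        // Haptics are unavailable on some devices (e.g. most iPads).
        guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }
        engine = try CHHapticEngine()
        try engine?.start()
    }

    func playPulse(loudness: Float) throws {
        // Map loudness to vibration intensity, clamped to the valid range.
        let intensity = CHHapticEventParameter(parameterID: .hapticIntensity,
                                               value: max(0, min(1, loudness)))
        let sharpness = CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.4)
        let event = CHHapticEvent(eventType: .hapticContinuous,
                                  parameters: [intensity, sharpness],
                                  relativeTime: 0,
                                  duration: 0.25)
        let pattern = try CHHapticPattern(events: [event], parameters: [])
        let player = try engine?.makePlayer(with: pattern)
        try player?.start(atTime: CHHapticTimeImmediate)
    }
}
```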
The eye-tracking tool is set up and calibrated using the front-facing camera on the iPhone or iPad, and does not require any additional hardware or software, Apple said.
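The system-wide Eye Tracking feature is an accessibility setting rather than a public API, so its internals are not documented. For a rough sense of the kind of camera-based gaze signal such a feature could build on, here is a minimal sketch using ARKit’s existing face tracking on devices with a TrueDepth front camera; the screen-mapping and calibration steps a real feature would need are omitted.

```swift
import ARKit

// Illustrative sketch only: read a gaze estimate from the front-facing
// camera via ARKit face tracking. Not Apple's Eye Tracking feature.
final class GazeSketch: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        // Face tracking requires a device with a TrueDepth camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // lookAtPoint is a gaze estimate in the face's own coordinate
            // space; calibration would map it onto screen coordinates.
            let gaze = face.lookAtPoint
            print("gaze estimate:", gaze)
        }
    }
}
```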
“We believe deeply in the transformative power of innovation to enrich lives,” Apple’s chief executive, Tim Cook, said.
“That’s why for nearly 40 years, Apple has championed inclusive design by embedding accessibility at the core of our hardware and software.

“We’re continuously pushing the boundaries of technology, and these new features reflect our long-standing commitment to delivering the best possible experience to all of our users.”