Eye control technology might sound like something from a sci-fi movie. But soon millions of iPhone users will be able to control their phones with their eyes.
Apple has confirmed that Eye Tracking is coming to iPad and iPhone. The new tool uses artificial intelligence (AI) to let users control their Apple devices using just their eyes.
Apple says, “Eye Tracking uses the front-facing camera to set up and calibrate in seconds. With on-device machine learning, all data used to set up and control this feature is kept securely on the device and isn’t shared with Apple.”
Last week, Apple introduced several new accessibility features, including the eye-tracking tool. Apple CEO Tim Cook said, “We believe deeply in the transformative power of innovation to enrich lives.” Apple has focused on inclusive design for nearly 40 years by integrating accessibility into its hardware and software.
“We are continuously pushing the boundaries of technology and these new features reflect our long-standing commitment to delivering the best possible experience to all of our users,” Cook added.
You won’t need extra hardware or accessories to use Eye Tracking with iPadOS and iOS apps. Once it’s set up, users can navigate the elements of an app with their eyes and activate buttons, swipes, and other gestures using something called Dwell Control.
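For readers curious about the mechanics, dwell selection boils down to watching whether the gaze stays put long enough to count as a deliberate choice. The Swift sketch below illustrates that idea; Apple has not published the API behind Eye Tracking, so the gaze feed and the DwellDetector type here are hypothetical.

```swift
import CoreGraphics
import Foundation

// A minimal sketch of dwell-based selection: if the gaze point stays
// within a small radius for long enough, treat it as a deliberate "tap".
// The gaze samples would come from a camera-based tracker; Apple has not
// published the API behind Eye Tracking, so this type is hypothetical.
final class DwellDetector {
    private let radius: CGFloat = 30          // tolerance around the anchor, in points
    private let dwellTime: TimeInterval = 1.0 // how long the gaze must hold still
    private var anchor: CGPoint?              // where the current dwell started
    private var anchorTime: TimeInterval?     // when it started (nil after firing)

    /// Feed gaze samples with timestamps; returns the dwell point
    /// whenever a selection should fire.
    func process(gaze: CGPoint, at time: TimeInterval) -> CGPoint? {
        if let a = anchor,
           (gaze.x - a.x) * (gaze.x - a.x) + (gaze.y - a.y) * (gaze.y - a.y) <= radius * radius {
            // Still looking at the same spot: fire once when the dwell time elapses.
            if let started = anchorTime, time - started >= dwellTime {
                anchorTime = nil // require the gaze to move away before re-firing
                return a
            }
        } else {
            // Gaze moved: restart the dwell at the new location.
            anchor = gaze
            anchorTime = time
        }
        return nil
    }
}
```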
Although the new feature won’t be available until later this year, it has already attracted a lot of interest on social media. One user tweeted, “For those with specific disabilities, this is fantastic news; however, for everyone else it simply means becoming more lazy than we already are.” Another said, “The Black Mirror episodes making more sense every day.” And one joked, “Oh, this generation is about to be the laziest generation ever.”
Apple also introduced a feature that could help reduce motion sickness for passengers in moving vehicles. Apple says motion sickness is often caused by a sensory conflict between what a person sees and what they feel. The new Vehicle Motion Cues feature uses animated dots at the edges of the screen to represent changes in the vehicle’s motion, which can help reduce that conflict.
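The underlying idea is simple enough to sketch: read the device’s motion sensors and move the on-screen cues in step with the vehicle. The Swift snippet below, using Apple’s Core Motion framework, illustrates the concept rather than Apple’s implementation; the MotionCueController type and the 40-point scaling factor are assumptions.

```swift
import CoreGraphics
import CoreMotion

// A rough sketch of the idea behind Vehicle Motion Cues: sample the device's
// motion and shift on-screen dots to mirror the vehicle's movement, so what
// the eyes see matches what the inner ear feels. An illustration only;
// the type name and scaling factor are assumptions, not Apple's implementation.
final class MotionCueController {
    private let motion = CMMotionManager()

    /// `updateDots` receives a lateral offset, in points, for the cue dots.
    func start(updateDots: @escaping (CGFloat) -> Void) {
        guard motion.isDeviceMotionAvailable else { return }
        motion.deviceMotionUpdateInterval = 1.0 / 60.0
        motion.startDeviceMotionUpdates(to: .main) { data, _ in
            guard let accel = data?.userAcceleration else { return }
            // Map sideways acceleration (measured in g) to a small visual offset.
            updateDots(CGFloat(accel.x) * 40.0)
        }
    }

    func stop() {
        motion.stopDeviceMotionUpdates()
    }
}
```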
Another new feature is Music Haptics, which uses the iPhone’s Taptic Engine to let users who are deaf or hard of hearing feel the music as taps and vibrations.
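Apps can already drive the Taptic Engine through Apple’s Core Haptics framework, which gives a feel for how music can be rendered as touch. The sketch below plays a simple four-beat pattern; it illustrates the general technique, not the Music Haptics feature itself.

```swift
import CoreHaptics

// An illustrative sketch using Core Haptics, the framework apps use to drive
// the Taptic Engine. It plays four "beats" as haptic taps, alternating strong
// and soft, to show how a rhythm can be turned into vibration.
func playBeatPattern() throws {
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }
    let engine = try CHHapticEngine()
    try engine.start()

    // Four transient taps, 0.25 s apart, alternating full and half intensity.
    let events = (0..<4).map { i in
        CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                .init(parameterID: .hapticIntensity, value: i % 2 == 0 ? 1.0 : 0.5),
                .init(parameterID: .hapticSharpness, value: 0.6),
            ],
            relativeTime: Double(i) * 0.25
        )
    }
    let pattern = try CHHapticPattern(events: events, parameters: [])
    try engine.makePlayer(with: pattern).start(atTime: 0)
}
```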
Additionally, Apple announced new speech features for customers with conditions that affect speech. These will let users teach Siri, Apple’s virtual assistant, custom phrases that launch shortcuts and open apps.
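Today, apps expose Siri phrases through Apple’s App Intents framework, which hints at where such custom phrases plug in. The sketch below is a loose illustration under that assumption; OpenJournalIntent and its phrase are hypothetical, and the new accessibility feature itself is something users configure in Settings, not in code.

```swift
import AppIntents

// A loose sketch using the App Intents framework, which is how apps expose
// phrases Siri can act on today. OpenJournalIntent and its phrase are
// hypothetical examples for illustration.
struct OpenJournalIntent: AppIntent {
    static var title: LocalizedStringResource = "Open Journal"

    func perform() async throws -> some IntentResult {
        // Navigate to the app's journal screen here.
        return .result()
    }
}

struct JournalShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: OpenJournalIntent(),
            phrases: ["Open my journal in \(.applicationName)"],
            shortTitle: "Open Journal",
            systemImageName: "book"
        )
    }
}
```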