Apple/Unsplash
Apple has announced Eye Tracking, a technology that lets users control an iPhone or iPad with their eyes. It is aimed primarily at people with disabilities.
The new accessibility feature, called Eye Tracking, will arrive in iOS 18, iPadOS 18 and visionOS 2. It relies on artificial intelligence and the front-facing camera to let users control an iPhone or iPad with eye movements alone.
“Every year we break new ground on accessibility,” said Sarah Herrlinger, Apple's senior director of Global Accessibility Policy and Initiatives. “These new features will impact the lives of a wide range of users by providing new ways to communicate, control their devices and navigate the world.”
All Eye Tracking data is stored on the device, and no additional hardware is required. The feature will work across all apps on iOS and iPadOS.
You can assign a custom spoken phrase/Apple
“Artificial intelligence has the potential to improve speech recognition for millions of people with atypical speech, so we are very excited that Apple is making these new accessibility features available to consumers,” said Mark Hasegawa-Johnson of the Speech Accessibility Project at the Beckman Institute for Advanced Science and Technology.
Another feature is Music Haptics, a new way to experience music for users who are deaf or hard of hearing. The iPhone's Taptic Engine plays taps, textures and vibrations in time with the music's audio. Both features are expected to launch later this year.
Sound recognition function/Apple