Apple will launch accessibility features like eye tracking, music haptics and vocal shortcuts later this year. The AI-powered updates are expected to be part of the iPhone-maker's iOS 18 release and aim to improve accessibility on the company's devices.
Eye tracking for the iPhone and iPad, designed for users with physical disabilities, will use the front-facing camera to let users navigate elements in an app and perform functions such as pressing physical buttons, swiping and other gestures.
Apple says the features will use on-device machine learning; the data will be stored on the device and will not be shared with the company.
Music haptics is designed for users with hearing impairments. The feature will use the iPhone's Taptic Engine to play taps, textures and refined vibrations synced to the audio of the music.
Apple also announced that music haptics will be available as an API, letting developers make music more accessible in their own apps.
Vocal shortcuts on iOS 18 will be available on the iPhone and iPad, allowing users to assign custom utterances that prompt Siri to launch shortcuts and run complex tasks.
The company also announced that future updates will include a feature allowing its devices to listen for and understand atypical speech. It will use on-device machine learning to recognise users' speech patterns, "enhancing speech recognition for a wider range of speech".
Other upcoming features announced by the company include vehicle motion cues, which can help reduce motion sickness for passengers using an iPhone or iPad in a moving vehicle, as well as enhanced voice controls for CarPlay along with colour filters and sound recognition.
The company also announced improvements to VoiceOver, which will include new voices, enhancements to Braille Screen Input, and hover typing for users with low vision.
Published - May 17, 2024 12:32 pm IST