Apple products for everyone. Last week, ahead of Global Accessibility Awareness Day on May 16, Apple announced a host of accessibility features set to launch later this year with iOS 18. These include eye tracking, music haptics, vocal shortcuts, and a motion-sickness aid. visionOS and Apple CarPlay will also get minor upgrades.
New features, who dis? Eye tracking gives people with physical disabilities hands-free navigation on iOS and iPadOS. Apple says the feature uses the front-facing camera to follow the user's eye movements, powered by on-device machine learning, with no additional hardware or accessories required. The news comes hot on the heels of Apple's agreement with OpenAI for ChatGPT to power the phone-maker's AI features.
Reexperiencing music. Apple's new music haptics option uses the iPhone's Taptic Engine to play "refined vibrations" in rhythm with a track, layering taps and textures over the audio to make music accessible to people who are deaf or hard of hearing. The feature will initially support only tracks in Apple Music, but the company says developers such as Spotify will be able to integrate it through an API.
Siri is about to become a better listener, too. The AI assistant will listen for "atypical speech," learning the user's speech patterns to better assist people with conditions that affect speech, such as cerebral palsy or ALS. That arrives alongside customizable vocal shortcuts: users designate a word or phrase that tells Siri to carry out a complex task.
Great news for those who travel poorly: Apple wants to help alleviate your motion sickness. If using your device in a moving vehicle leaves you nauseated and dizzy, Apple may have a solution. The feature places animated dots along the edges of your screen that move with the vehicle, reducing the sensory conflict between what you see and what you feel.