Much of the attention around Apple’s latest iOS 18 update has centered on the exciting new features arriving with the Apple Intelligence suite. However, the update also debuts new accessibility features that deserve attention of their own.
While the AI features are limited to specific iPhone models for now, such as the iPhone 15 Pro and iPhone 15 Pro Max, the new eye tracking accessibility option will be available on a broader range of devices, including the iPhone 12 and later as well as the budget-friendly third-generation iPhone SE.
Apple announced a suite of accessibility features earlier this year, promising their arrival “later this year.” Now officially launching with iOS 18, these features include Eye Tracking, Music Haptics, and Vocal Shortcuts.
Eye Tracking in iOS 18 is a groundbreaking accessibility feature that uses on-device machine learning to let users with motor impairments navigate their iPhones and iPads with only their eyes. It relies on the front-facing camera, with setup and calibration taking just seconds, and all data used for the feature is stored securely on the device.
Users can move between on-screen elements with their eyes, then use Dwell Control to activate buttons or perform gestures such as swiping up and down, all through eye movement alone. Dwell Control has shown some rough edges during beta testing, but Apple has been refining it ahead of release for a smoother experience.
Beyond Eye Tracking, iOS 18 introduces Music Haptics, which adds taps and vibrations synced to songs on Apple Music. Additionally, Vocal Shortcuts expands voice-command capabilities for users who have difficulty speaking for various reasons.
Overall, these new accessibility features aim to make iPhones more inclusive and user-friendly for individuals with different needs. Stay tuned for more updates as iOS 18 continues to evolve!