This Hidden Setting Lets You Control Your iPhone with Your Eyes

The feature is buried within your iPhone’s Settings app.

Eye Tracking on iOS 18
Photo by Tucker Bowe for Gear Patrol

It hasn’t been nearly as hyped as other standout iOS 18 features — such as a completely customizable Home Screen, an overhauled Photos app, and the first Apple Intelligence features — but the fall software update gave your iPhone a sneaky cool feature.

It’s called Eye Tracking and, as its name gives away, it allows you to control your iPhone with your eyes.

Eye Tracking on iOS 18
Eye Tracking can be accessed via your iPhone’s Accessibility settings.
Photo by Tucker Bowe for Gear Patrol

Eye phone control

Eye Tracking is a new accessibility feature that, when enabled, uses your iPhone’s front-facing camera to track your eye movements. An onscreen pointer shows you where you’re looking.

With Eye Tracking, you can scroll and swipe by looking at your iPhone screen. You can also perform actions (like selecting an option or opening an app) by “dwelling” or holding your gaze.

As an accessibility feature, Eye Tracking is intended for users with physical disabilities, so it isn’t designed for everybody. But touch controls still work when it’s enabled, and it’s neat to try out.

Eye Tracking on iOS 18
Setting up Eye Tracking takes less than a minute.
Photo by Tucker Bowe for Gear Patrol

How to enable Eye Tracking

  1. Open the Settings app on your iPhone.
  2. Select Accessibility.
  3. Scroll down, select Eye Tracking and toggle it on.

Once enabled, your iPhone will immediately jump into a setup process for calibrating your eyes. It takes about a minute and has you stare at colored dots as they appear around your iPhone screen.

One of the neat things about Eye Tracking is that it’s fairly customizable. Once it’s set up, you can adjust settings such as how smoothly the pointer moves and how long your eyes need to “dwell” on an option to select it.

Eye Tracking on iOS 18
You can customize various settings for Eye Tracking once enabled.
Photo by Tucker Bowe for Gear Patrol

Eye Tracking is just one of the new accessibility features that Apple rolled out with iOS 18. Other neat additions include Music Haptics, which lets users who are deaf or hard of hearing experience music, and Vocal Shortcuts, which lets you teach Siri custom phrases that trigger different actions.

To use Eye Tracking, you need an iPhone 12 (or third-generation iPhone SE) or later running iOS 18.

You can visit Apple’s support page to learn more about Eye Tracking.