
I tried out the eye tracking feature in iOS 18 on my iPhone. Here’s how it works

Apple never ceases to impress its fans when it comes to accessibility features. The latest iOS 18 introduces several new accessibility features, the most prominent of which is eye tracking on iPhone. Yes, the most amazing feature of Apple Vision Pro is now also available on iPhone and iPad. With AI-powered eye tracking, iPhone users can navigate and control their devices with just their eyes. How cool is that! This feature looks impressive in theory, and we performed a thorough hands-on test to determine whether it’s ideal for everyday use. Whether you need accessibility features or just want to try out a new feature, here’s how to enable and use iPhone eye tracking in iOS 18. Let’s get started!

Devices that support eye tracking

It’s worth knowing that eye tracking is not available on all iOS 18-compatible devices. So even if you are running the latest iOS 18, eye tracking may not be available on your iPhone. That’s because Apple’s eye tracking feature is only supported on iPhone 12 and newer models.

Here is the full list of devices compatible with iOS 18 eye tracking:

  • iPhone 12, iPhone 12 Mini
  • iPhone 12 Pro, iPhone 12 Pro Max
  • iPhone 13, iPhone 13 Mini
  • iPhone 13 Pro, iPhone 13 Pro Max
  • iPhone 14, iPhone 14 Plus
  • iPhone 14 Pro, iPhone 14 Pro Max
  • iPhone 15, iPhone 15 Plus
  • iPhone 15 Pro, iPhone 15 Pro Max
  • iPhone 16, iPhone 16 Plus
  • iPhone 16 Pro, iPhone 16 Pro Max
  • iPad (10th generation)
  • iPad Air (M2)
  • iPad Air (3rd generation and later)
  • iPad Pro (M4)
  • iPad Pro 12.9-inch (5th generation or later)
  • iPad Pro 11-inch (3rd generation or later)
  • iPad mini (6th generation)

Enable eye tracking in iOS 18

Turning on eye tracking in iOS 18 is quite easy. Plus, setup only takes a few minutes. Here are the steps on how to do it:

  • Open the Settings app and go to the Accessibility section.
  • Here, scroll down and select Eye Tracking under the Physical and Motor section.
  • On the next screen, turn on the Eye Tracking toggle and follow the on-screen instructions to set up eye tracking on your iPhone. All you need to do is look at the colored dots one by one.
  • For better results, set up eye tracking with your iPhone or iPad placed on a stable surface, about 1.5 feet (45 cm) away from your face. Also, avoid blinking during this process.

Once you’re done, do the following:

  • Turn on Dwell Control. This lets you tap the highlighted item by simply holding your gaze on it for a moment.
  • Leave the Snap to Item option enabled. This is an important setting that places an outline around the item you are currently looking at.
  • The Auto-Hide option lets you set how long it takes for the cursor to reappear. By default, it is set to 0.50 seconds, but you can choose anywhere from 0.10 to 4 seconds depending on your preferences.
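If you’re curious what dwell selection boils down to, Apple hasn’t published how it’s implemented, but the core idea is just a timer: when your gaze rests on the same element for long enough, it counts as a tap. Here’s a minimal, hypothetical Swift sketch of that logic; the GazeDwellTracker type and its values are illustrative and not Apple API.

```swift
import Foundation
import CoreGraphics

/// A hypothetical sketch of dwell selection: if the estimated gaze point stays
/// inside one element's frame for a set duration, treat it as a tap.
/// This is NOT Apple's implementation, only an illustration of the concept.
final class GazeDwellTracker {
    /// How long the gaze must rest on an element before it is selected
    /// (0.5 s is an assumed default, not a documented value).
    var dwellDuration: TimeInterval = 0.5

    private var currentElementFrame: CGRect?
    private var dwellStart: Date?

    /// Call this each time a new gaze point is estimated.
    /// Returns true when the dwell threshold is reached (i.e. "tap" the element).
    func update(gazePoint: CGPoint, elementFrame: CGRect?) -> Bool {
        guard let frame = elementFrame, frame.contains(gazePoint) else {
            // Gaze is not on any interactive element; reset the timer.
            currentElementFrame = nil
            dwellStart = nil
            return false
        }

        if currentElementFrame != frame {
            // Gaze moved to a new element; start timing again.
            currentElementFrame = frame
            dwellStart = Date()
            return false
        }

        // Still looking at the same element; fire once the duration elapses.
        if let start = dwellStart, Date().timeIntervalSince(start) >= dwellDuration {
            dwellStart = nil // prevent repeated triggers until the gaze moves away
            return true
        }
        return false
    }
}
```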

How to use eye tracking on iPhone

When you turn on eye tracking on your iPhone, an invisible cursor tracks and follows your eye movements. When you look at an interactive element on a website or in an app, the system highlights it and surrounds it with a rectangular outline. To tap or select an item on the screen, just hold your gaze on it for a moment and the selected action will be performed.

When you turn on eye tracking in iOS 18, it automatically turns on Apple’s AssistiveTouch feature, and the two work together quite nicely. AssistiveTouch lets you do more just by looking at your iPhone. You’ll see the AssistiveTouch button, a circle in the lower-right corner that provides quick access to various shortcuts and options that would typically require swiping or other gestures. It includes Scroll options to scroll up or down the screen, and you can also perform horizontal or vertical swipes. In most cases, these options work quite accurately.
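As far as I know, there is no public API that reports eye tracking directly, but developers can detect AssistiveTouch, which, as noted above, comes on along with eye tracking, using UIKit’s standard accessibility APIs. The isAssistiveTouchRunning flag and the notification below are real UIKit API; the idea that they also reflect eye-tracking-driven AssistiveTouch is my assumption based on the behavior described here.

```swift
import UIKit

/// Checks whether AssistiveTouch is running and reacts when that changes.
final class AssistiveTouchWatcher {
    private var observer: NSObjectProtocol?

    func start() {
        // Public UIKit API; assumption: this also turns true when eye tracking
        // enables AssistiveTouch, as described in the article.
        print("AssistiveTouch running: \(UIAccessibility.isAssistiveTouchRunning)")

        // Re-check the flag whenever the AssistiveTouch status changes, so the
        // app can adapt its UI (for example, larger tap targets for dwell taps).
        observer = NotificationCenter.default.addObserver(
            forName: UIAccessibility.assistiveTouchStatusDidChangeNotification,
            object: nil,
            queue: .main
        ) { _ in
            print("AssistiveTouch now running: \(UIAccessibility.isAssistiveTouchRunning)")
        }
    }

    deinit {
        if let observer { NotificationCenter.default.removeObserver(observer) }
    }
}
```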


With AssistiveTouch and eye tracking, you can activate Siri, go to the Home Screen, lock the screen, open Control Center, rotate the screen, adjust the volume, and much more on your iPhone just by looking at it.

Configure hot corners with eye tracking

We all know how to set up and use Hot Corners on the Mac. With the AssistiveTouch and Dwell Control options, you can also use hot corners on iPhone and iPad. Interestingly, the new eye tracking feature in iOS 18 elevates the experience. With eye tracking and hot corners turned on, you can quickly trigger an action by simply looking at that corner of the screen. For example, I set the top-right corner to go to the Home Screen. By default, the Calibrate Eye Tracking action is assigned to the top-left corner.

Here’s how you can do it:

Note:
Before you begin, make sure you have AssistiveTouch and Dwell Control enabled.

  • On your iPhone or iPad, go to Settings -> Accessibility -> Touch.
  • Here, select AssistiveTouch and tap Hot Corners.
  • You can now assign your preferred actions to the top-left, top-right, bottom-left, and bottom-right corners of your screen.
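To make the hot-corner idea concrete, here is a purely illustrative Swift sketch; none of these types are Apple API. Each corner maps to an action, and a dwell point that lands inside a corner zone triggers it, much like assigning the Home Screen to the top-right corner as described above.

```swift
import CoreGraphics

/// Illustrative only: mirrors how Hot Corners pairs each screen corner with an
/// action in Settings -> Accessibility -> Touch -> AssistiveTouch.
enum CornerAction {
    case recalibrateEyeTracking   // the default for the top-left corner
    case goToHomeScreen           // what I assigned to the top-right corner
    case openControlCenter
    case lockScreen
}

struct HotCorners {
    var topLeft: CornerAction? = .recalibrateEyeTracking
    var topRight: CornerAction? = .goToHomeScreen
    var bottomLeft: CornerAction?
    var bottomRight: CornerAction?

    /// Size of the invisible corner zone a dwell must land in (assumed value).
    var cornerSize = CGSize(width: 60, height: 60)

    /// Returns the action for a dwell point, or nil if it isn't in any corner.
    func action(forDwellAt point: CGPoint, in bounds: CGRect) -> CornerAction? {
        let inLeft = point.x <= cornerSize.width
        let inRight = point.x >= bounds.width - cornerSize.width
        let inTop = point.y <= cornerSize.height
        let inBottom = point.y >= bounds.height - cornerSize.height

        switch (inTop, inBottom, inLeft, inRight) {
        case (true, _, true, _): return topLeft
        case (true, _, _, true): return topRight
        case (_, true, true, _): return bottomLeft
        case (_, true, _, true): return bottomRight
        default:                 return nil
        }
    }
}
```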

How accurate is eye tracking on iPhones?

Eye Tracking works incredibly well on Apple Vision Pro, all thanks to advanced LEDs and infrared cameras. However, Apple’s eye tracking isn’t the most accurate on the iPhone and iPad because these devices don’t come with high-end tracking cameras. Instead, iPhones and iPads use front-facing cameras and device intelligence to track eye movements.
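Apple hasn’t detailed how that on-device gaze estimation works, but ARKit’s face tracking on TrueDepth-equipped devices gives a rough idea of the kind of front-camera data involved. The sketch below only prints ARKit’s lookAtPoint estimate; it is not how iOS 18 implements system-wide eye tracking, and mapping that estimate to screen coordinates, which is the hard part, is omitted.

```swift
import ARKit

/// A rough sketch of gaze data from ARKit face tracking (TrueDepth devices).
/// Shown only to illustrate what the front camera can provide to apps;
/// Apple's system-wide eye tracking may work very differently.
final class GazeEstimator: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else {
            print("Face tracking is not supported on this device")
            return
        }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // lookAtPoint is the point (in face-anchor space) the eyes converge on.
            let look = faceAnchor.lookAtPoint
            print("Look-at point: \(look.x), \(look.y), \(look.z)")
            // Projecting and smoothing this into screen coordinates is omitted.
        }
    }
}
```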

Let’s talk about the good things first. Setting up eye tracking on iPhone is easy and works quite quickly. Using eye tracking on iPhone is fun, but it’s not perfect or error-free. Sometimes my iPhone doesn’t highlight the item I’m currently looking at, so I often have to try different eye and head movements before the system highlights what I’m aiming for. This happens most often when I scroll through pages that contain a lot of text, such as the Settings app. That said, browsing the Home Screen and opening photos was quite accurate. Additionally, toggling settings such as Bluetooth, Wi-Fi, Airplane Mode, and more was quick and precise in most cases. Still, the overall experience was rather average and nothing impressive.

I used eye tracking on the iPhone 12, iPhone 14 Pro, and iPad Pro M2. I also tried it on the iPhone 16 Plus and iPhone 16 Pro Max when I went to check them out at the Apple Store. I found that eye tracking works slightly better on the iPad than on iPhones. However, I think Apple has a lot of catching up to do to make eye tracking in iOS 18 an impressive feature.

Here are some suggestions on how to get the most out of Eye Tracking on iPhones and iPads:

  • Keep your iPhone/iPad on a stable surface, about 1.5 feet (45 cm) away from your face. If the device is too close to your face, eye tracking will not work properly.
  • If you are holding the device in your hand, remember to hold it as still as possible. If you move away from the device or change location, you may need to recalibrate.
  • If eye tracking seems off or less accurate than before, turn it off and set it up again to recalibrate.
  • Be sure to sit in a well-lit area. Eye tracking may not function properly in dark or poorly lit rooms.
  • Avoid sitting in front of a light source. This can blur your face and make it harder for the camera to see and track your eyes.

Turn off eye tracking on iPhone

There are two ways to turn off eye tracking on iPhone. We have discussed both below:

  • Go to Settings -> Accessibility -> Eye Tracking and turn off the Eye Tracking switch. From the pop-up window, tap OK to confirm your decision.
  • Alternatively, you can add an Eye Tracking control to the customizable Control Center in iOS 18. Once you’ve added the control, you can tap it to turn iPhone eye tracking on or off directly from Control Center.

This is how you can use eye tracking on iPhones. Have you tried this feature? How did it work for you? Tell us in the comments below.

Can iPhones track your eyes?

In the latest iOS 18, iPhone 12 and later models support eye tracking. The all-new Eye Tracking feature in iOS 18 allows users to navigate their iPhones using just their eyes.

Why don’t I have eye tracking on my iPhone?

It’s worth knowing that eye tracking isn’t available on all iOS 18 devices. Only iPhone SE 3 and iPhone 12 or later models have eye tracking.

Where is eye tracking on iPhone?

Eye tracking is available in the Accessibility section of your iPhone settings. Go to Settings -> Accessibility -> Eye Tracking.

Does iPhone 13 have eye tracking?

Yes, iPhone 13 supports Apple eye tracking.

Can I use iPhone Mirroring with eye tracking?

No, you cannot use iPhone mirroring with eye tracking. This is because iPhone Mirroring only works when the iPhone is locked and not in use. However, when you turn on eye tracking on your iPhone, the front camera will always be used. Additionally, eye tracking remains active even when your iPhone is locked.

How do I turn off eye tracking on my iPhone?

Open Settings -> Accessibility -> Eye Tracking and turn off the Eye Tracking switch.