
How to Use Eye Tracking on visionOS for Hands-Free Control

Image: a person wearing the Apple Vision Pro headset, with the visionOS interface showing a gaze-controlled cursor for hands-free scrolling and app selection.

To begin using eye tracking, you’ll need to calibrate the Vision Pro to recognize your gaze. The setup process is straightforward and takes about a minute. Follow these steps:

  1. Access Settings: Put on your Vision Pro and navigate to the Settings app. Scroll to the Eyes & Hands section (or Accessibility > Interaction for more options).

  2. Enable Eye Tracking: Select Eye Tracking and toggle it on. A pop-up will guide you through calibration.

  3. Follow the Dot: A moving dot will appear on the screen, stopping at various points. Look at the dot as it moves to calibrate the system to your eye movements. For best results, remain stationary and ensure the headset fits correctly, as misalignment can affect accuracy.

  4. Confirm Calibration: Once complete, a black dot (the Dwell Pointer) will appear on-screen, tracking your gaze. This acts as a cursor, replacing finger-based input.

If tracking feels off, you can recalibrate by pressing the top left button on the headset four times or by going to Settings > Eyes & Hands > Redo Eye Setup. Siri can also initiate this process with a command like, “Siri, set up eyes.”
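The Dwell Pointer described above behaves like a simple state machine: a target is selected only after your gaze has rested on it continuously for a set interval, and glancing away resets the timer. The Swift sketch below is an illustrative model of that logic, not Apple's implementation; the types `GazeSample` and `DwellSelector` are invented for demonstration.

```swift
import Foundation

// Hypothetical model of dwell-based selection: a target fires only
// after gaze has rested on it continuously for `dwellDuration`.
struct GazeSample {
    let time: TimeInterval   // seconds since session start
    let target: String?      // UI element under the gaze, nil if none
}

final class DwellSelector {
    let dwellDuration: TimeInterval
    private var currentTarget: String?
    private var gazeStart: TimeInterval = 0

    init(dwellDuration: TimeInterval = 1.0) {
        self.dwellDuration = dwellDuration
    }

    // Feed one gaze sample; returns the target if the dwell completed.
    func process(_ sample: GazeSample) -> String? {
        guard let target = sample.target else {
            currentTarget = nil
            return nil
        }
        if target != currentTarget {
            // Gaze moved to a new element: restart the dwell timer.
            currentTarget = target
            gazeStart = sample.time
            return nil
        }
        return sample.time - gazeStart >= dwellDuration ? target : nil
    }
}

let selector = DwellSelector(dwellDuration: 1.0)
let samples = [
    GazeSample(time: 0.0, target: "Settings"),
    GazeSample(time: 0.5, target: "Settings"),
    GazeSample(time: 0.8, target: "Photos"),   // glance away resets dwell
    GazeSample(time: 1.2, target: "Photos"),
    GazeSample(time: 1.9, target: "Photos"),   // 1.1 s on Photos: select
]
let selections = samples.compactMap { selector.process($0) }
print(selections)  // ["Photos"]
```

Note how looking away from "Settings" before the one-second dwell elapsed cancels that selection entirely; only the sustained gaze on "Photos" triggers an event.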

Using Eye Tracking for Navigation

Eye tracking on visionOS serves as the primary targeting system, functioning like a mouse pointer or touchscreen hover: look at a button, link, or menu item to highlight it, then tap your thumb and forefinger together to select it, much as you would click a mouse.

These interactions rely on the Vision Pro's internal infrared cameras and LEDs, which project light patterns onto the eyes to track their movement with precision. The R1 chip processes this sensor data with very low latency, and the resulting gaze data also drives foveated rendering, which concentrates display detail where you are actually looking.
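Foveated rendering can be sketched as a policy that assigns a rendering scale based on how far a region of the display is from the gaze point: full detail at the fovea, progressively cheaper rendering toward the periphery. The Swift snippet below illustrates the idea; the tier boundaries are invented for demonstration, and Apple's actual pipeline is not public.

```swift
import Foundation

// Illustrative foveated-rendering policy: full resolution near the
// gaze point, progressively cheaper rendering toward the periphery.
// The angular thresholds below are invented for demonstration.
func renderScale(angularDistanceDegrees d: Double) -> Double {
    switch d {
    case ..<5:  return 1.0    // fovea: full resolution
    case ..<15: return 0.5    // near periphery: half resolution
    default:    return 0.25   // far periphery: quarter resolution
    }
}

print(renderScale(angularDistanceDegrees: 2))   // 1.0
print(renderScale(angularDistanceDegrees: 10))  // 0.5
print(renderScale(angularDistanceDegrees: 30))  // 0.25
```

Because human visual acuity drops sharply outside a few degrees of the gaze point, this kind of policy saves substantial GPU work without the wearer noticing, which is why accurate eye tracking matters for display efficiency and not just for input.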

Customizing Eye Tracking Settings

To tailor the experience, explore the options under Settings > Accessibility > Interaction and Pointer Control, which let you adjust how the system interprets and responds to your gaze.

For optimal performance, make sure the headset is seated correctly on your face, keep the lenses clean, and avoid strong glare or reflections; if you need vision correction, use properly fitted optical inserts rather than glasses, which the headset is not designed to accommodate. Cult of Mac notes that the Vision Pro's dedicated infrared eye cameras are far better suited to gaze tracking than the single front-facing camera that powers eye tracking on iPhone, where accuracy varies much more with lighting and head position.

Accessibility and Practical Applications

Eye tracking is a cornerstone of Vision Pro’s accessibility, designed to assist users with physical disabilities who may struggle with hand gestures or touch inputs. As Apple highlights, visionOS supports a flexible input system, allowing control via eyes, voice, or a combination, with features like Switch Control and Sound Actions complementing eye tracking. Ryan Hudson-Peralta, an accessibility consultant, praised Vision Pro as “the most accessible technology I’ve ever used,” noting its seamless functionality for users with limited mobility.

Beyond accessibility, eye tracking enhances everyday use. For instance, hands-free scrolling in visionOS 3 could make browsing emails or websites more comfortable than flicking a thumb on an iPhone. The ability to trigger actions like opening the Control Center or typing on a virtual keyboard by gaze alone streamlines tasks, especially in mixed-reality environments where physical inputs may be impractical.

Why It Matters

Eye tracking on visionOS represents a leap in human-computer interaction, aligning with Apple's "it just works" philosophy. UploadVR reports that users find this system more intuitive than controller-based VR headsets, thanks to the seamless integration of eye and hand tracking. The Vision Pro's camera array, including four dedicated infrared eye-tracking cameras, and the R1 chip ensure precise, low-latency performance, setting a new standard for mixed-reality interfaces.

For tech users, this technology offers practical benefits: navigating apps without lifting a hand, controlling devices in hands-busy scenarios (like cooking or working), and enabling inclusive access for those with motor impairments. As Apple prepares to unveil visionOS 3 at WWDC 2025 on June 9, the addition of eye-based scrolling could further elevate the Vision Pro’s appeal, potentially surpassing iPhone navigation in ease of use.

Troubleshooting and Tips

If eye tracking feels inaccurate, The Verge and Cult of Mac recommend redoing calibration (Settings > Eyes & Hands > Redo Eye Setup), adjusting the headset until it sits correctly, and keeping the lenses free of smudges, glare, and reflections.

If issues persist, contact Apple Support or file a Feedback Assistant request to report specific use-case challenges, as eye-tracking data isn’t directly accessible to developers for privacy reasons.
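That privacy boundary can be sketched in code: the system layer sees raw gaze coordinates, while an app is handed only the resulting interaction event. The Swift snippet below is a hypothetical illustration of this model; `SystemGazeResolver` and `InteractionEvent` are invented names, not a real visionOS API.

```swift
import Foundation

// Hypothetical illustration of visionOS's privacy model: raw gaze
// coordinates stay inside the "system" layer; the app receives only
// the high-level event ("the user selected this element").
enum InteractionEvent {
    case selected(elementID: String)
}

struct SystemGazeResolver {
    // The hit-testing closure (and the gaze data fed to it) is private
    // to the system and never exposed to the app.
    private let hitTest: (_ x: Double, _ y: Double) -> String?

    init(hitTest: @escaping (Double, Double) -> String?) {
        self.hitTest = hitTest
    }

    // On a pinch, the system resolves gaze -> element and emits an event.
    func pinch(atGazeX x: Double, y: Double) -> InteractionEvent? {
        guard let id = hitTest(x, y) else { return nil }
        return .selected(elementID: id)
    }
}

// App side: only the event is visible, never the gaze coordinates.
let resolver = SystemGazeResolver { x, y in
    (0..<100 ~= x && 0..<50 ~= y) ? "openButton" : nil
}
if case .selected(let id)? = resolver.pinch(atGazeX: 40, y: 20) {
    print("App received tap on \(id)")  // App received tap on openButton
}
```

This mirrors how visionOS reportedly handles hover effects: the highlight is rendered by the system outside the app's process, so where you looked before pinching is never revealed to third-party code.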

Looking Ahead

With visionOS 3, expected in fall 2025, eye tracking will become even more powerful, particularly with hands-free scrolling across Apple’s built-in apps and third-party developer support. Bloomberg’s Mark Gurman notes that this feature leverages existing hardware, ensuring no additional cost for users. As Apple continues to refine visionOS, eye tracking could redefine how we interact with spatial computing, making the Vision Pro a must-have for tech enthusiasts and accessibility-focused users alike.
