With built-in eye-tracking cameras, the Apple Vision Pro (AVP) enables gaze-based interaction, eye image rendering on external screens, and iris recognition for device unlocking. A key technological advance of the AVP is its heavy reliance on gaze- and gesture-based interaction. However, limited information is available regarding the specifics of the AVP's eye-tracking device, and raw gaze data is inaccessible to developers. This study evaluated the eye-tracking accuracy of the AVP by leveraging foveated rendering, and examined how tracking accuracy relates to user-reported usability. The results revealed an overall gaze error of 2.5° (or 61.95 pixels) within a tested field of view (FOV) of approximately 34° × 18°. As expected, the lowest gaze error was observed in the central FOV, with higher gaze errors in peripheral areas. The usability and learnability scores of the AVP, measured using the standard System Usability Scale (SUS), were 73 and 70, respectively. Importantly, no statistically reliable correlation between gaze error and usability scores was found. These results suggest that the eye-tracking accuracy of the AVP is comparable to that of other VR/AR headsets. While eye-tracking accuracy is critical for gaze-based interaction, it is not the sole determinant of user experience in AR/VR.
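The abstract reports gaze error both as a visual angle (2.5°) and in pixels (61.95 px), which implies a conversion factor of roughly 24.8 px/°. Below is a minimal sketch of how angular gaze error is typically computed from gaze and target direction vectors, and how it might be mapped to pixels using that implied factor; this is an illustrative assumption, not the paper's actual estimation pipeline (which must work around the AVP's inaccessible raw gaze data via foveated rendering).

```python
import numpy as np

def angular_error_deg(gaze_dir: np.ndarray, target_dir: np.ndarray) -> float:
    """Angle in degrees between a gaze ray and the ray to the target."""
    g = gaze_dir / np.linalg.norm(gaze_dir)
    t = target_dir / np.linalg.norm(target_dir)
    # Clip guards against floating-point values just outside [-1, 1].
    return float(np.degrees(np.arccos(np.clip(np.dot(g, t), -1.0, 1.0))))

# Assumed pixels-per-degree factor, derived from the reported figures
# (61.95 px / 2.5 deg ~= 24.8 px/deg); not an official device spec.
PX_PER_DEG = 61.95 / 2.5

def error_in_pixels(error_deg: float) -> float:
    return error_deg * PX_PER_DEG

# Example: a gaze ray deviating slightly from a target straight ahead.
gaze = np.array([0.03, 0.01, 1.0])
target = np.array([0.0, 0.0, 1.0])
err = angular_error_deg(gaze, target)
print(f"{err:.2f} deg ~ {error_in_pixels(err):.1f} px")
```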