The cameras in modern gaze-tracking systems suffer from fundamental bandwidth and power limitations, constraining data acquisition speed to 300 Hz realistically. This obstructs the use of mobile eye trackers to perform, e.g., low-latency predictive rendering, or to study quick and subtle eye motions like microsaccades using head-mounted devices in the wild. Here, we propose a hybrid frame-event-based near-eye gaze tracking system offering update rates beyond 10,000 Hz with an accuracy that matches that of high-end desktop-mounted commercial trackers when evaluated in the same conditions. Our system builds on emerging event cameras that simultaneously acquire regularly sampled frames and adaptively sampled events. We develop an online 2D pupil fitting method that updates a parametric model every one or few events. Moreover, we propose a polynomial regressor for estimating the point of gaze from the parametric pupil model in real time. Using the first event-based gaze dataset, available at https://github.com/aangelopoulos/event_based_gaze_tracking, we demonstrate that our system achieves accuracies of 0.45--1.75 degrees for fields of view from 45 degrees to 98 degrees. With this technology, we hope to enable a new generation of ultra-low-latency gaze-contingent rendering and display techniques for virtual and augmented reality.
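To make the gaze-estimation stage concrete, below is a minimal sketch of a polynomial regressor mapping a fitted 2D pupil center to gaze angles, in the spirit of the regressor described above. The polynomial degree, feature construction, least-squares fit, and all names (poly_features, fit_gaze_regressor, predict_gaze) are illustrative assumptions, not the paper's exact formulation.

```python
# Sketch: per-axis polynomial regression from pupil-ellipse centers to gaze
# angles, fit on calibration pairs and cheap enough to evaluate after every
# event-driven pupil-model update. Calibration values below are synthetic.
import numpy as np

def poly_features(px, py, degree=2):
    """Build polynomial features [1, x, y, x^2, xy, y^2, ...] up to `degree`."""
    feats = [np.ones_like(px)]
    for d in range(1, degree + 1):
        for i in range(d + 1):
            feats.append(px ** (d - i) * py ** i)
    return np.stack(feats, axis=-1)  # shape (N, n_features)

def fit_gaze_regressor(pupil_xy, gaze_deg, degree=2):
    """Least-squares fit: pupil centers (N, 2) -> known gaze angles (N, 2)."""
    A = poly_features(pupil_xy[:, 0], pupil_xy[:, 1], degree)
    coeffs, *_ = np.linalg.lstsq(A, gaze_deg, rcond=None)
    return coeffs  # (n_features, 2), one column per gaze axis

def predict_gaze(coeffs, pupil_xy, degree=2):
    """Evaluate the regressor on new pupil centers (real-time step)."""
    A = poly_features(pupil_xy[:, 0], pupil_xy[:, 1], degree)
    return A @ coeffs

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pupil = rng.uniform(-1, 1, size=(50, 2))            # normalized centers
    gaze = 30 * pupil + 0.1 * rng.normal(size=(50, 2))  # synthetic targets, deg
    C = fit_gaze_regressor(pupil, gaze)
    print(predict_gaze(C, pupil[:3]))
```

Because the regressor reduces to a single small matrix product at inference time, it can keep pace with the event-rate pupil updates rather than being limited to frame rate.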