Event-Based Near-Eye Gaze Tracking Beyond 10,000 Hz
IEEE Transactions on Visualization and Computer Graphics (IF 4.7). Pub Date: 2021-03-29. DOI: 10.1109/tvcg.2021.3067784
Anastasios N. Angelopoulos, Julien N.P. Martel, Amit P. Kohli, Jorg Conradt, Gordon Wetzstein
The cameras in modern gaze-tracking systems suffer from fundamental bandwidth and power limitations, realistically constraining data acquisition speed to about 300 Hz. This prevents mobile eye trackers from performing, e.g., low-latency predictive rendering, or from studying quick and subtle eye motions like microsaccades with head-mounted devices in the wild. Here, we propose a hybrid frame-event-based near-eye gaze tracking system offering update rates beyond 10,000 Hz with an accuracy that matches that of high-end desktop-mounted commercial trackers when evaluated under the same conditions. Our system, previewed in Figure 1, builds on emerging event cameras that simultaneously acquire regularly sampled frames and adaptively sampled events. We develop an online 2D pupil fitting method that updates a parametric model every one or few events. Moreover, we propose a polynomial regressor for estimating the point of gaze from the parametric pupil model in real time. Using the first event-based gaze dataset, we demonstrate that our system achieves accuracies of 0.45°-1.75° for fields of view from 45° to 98°. With this technology, we hope to enable a new generation of ultra-low-latency gaze-contingent rendering and display techniques for virtual and augmented reality.
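The abstract mentions a polynomial regressor that maps the fitted pupil model to a point of gaze. As a rough illustration of that idea (not the authors' implementation — the feature set, polynomial degree, and calibration procedure here are assumptions), the sketch below fits a 2D polynomial regression from pupil-center coordinates to gaze angles via least squares, the way a per-user calibration step might:

```python
import numpy as np

def poly_features(xy, degree=2):
    """Build a 2D polynomial feature matrix: [1, x, y, x^2, xy, y^2, ...]."""
    x, y = xy[:, 0], xy[:, 1]
    feats = [np.ones_like(x)]
    for d in range(1, degree + 1):
        for i in range(d + 1):
            feats.append(x ** (d - i) * y ** i)
    return np.stack(feats, axis=1)

def fit_gaze_regressor(pupil_centers, gaze_angles, degree=2):
    """Least-squares fit of polynomial coefficients from calibration samples."""
    A = poly_features(pupil_centers, degree)
    coeffs, *_ = np.linalg.lstsq(A, gaze_angles, rcond=None)
    return coeffs

def predict_gaze(coeffs, pupil_centers, degree=2):
    """Map pupil centers to gaze angles; cheap enough to run per event."""
    return poly_features(pupil_centers, degree) @ coeffs

# Synthetic calibration data: gaze is a known quadratic function of
# pupil position, so a degree-2 regressor should recover it exactly.
rng = np.random.default_rng(0)
pupil = rng.uniform(-1.0, 1.0, size=(50, 2))
gaze = np.stack([2.0 * pupil[:, 0] + 0.5 * pupil[:, 0] ** 2,
                 -1.5 * pupil[:, 1] + 0.3 * pupil[:, 0] * pupil[:, 1]],
                axis=1)
coeffs = fit_gaze_regressor(pupil, gaze)
pred = predict_gaze(coeffs, pupil)
print(np.allclose(pred, gaze, atol=1e-8))
```

Once the coefficients are calibrated, prediction is a single small matrix product, which is consistent with the real-time, per-event update rates the paper targets.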

Updated: 2021-04-16