Automated areas of interest analysis for usability studies of tangible screen-based user interfaces using mobile eye tracking
AI EDAM ( IF 2.1 ) Pub Date : 2020-09-11 , DOI: 10.1017/s0890060420000372
M. Batliner , S. Hess , C. Ehrlich-Adám , Q. Lohmeyer , M. Meboldt

The user's gaze can provide important information for human–machine interaction, but the manual analysis of gaze data is extremely time-consuming, inhibiting wide adoption in usability studies. Existing methods for automated areas of interest (AOI) analysis cannot be applied to tangible products with a screen-based user interface (UI), which have become ubiquitous in everyday life. The objective of this paper is to present and evaluate a method, based on computer vision and deep learning, that automatically maps the user's gaze to dynamic AOIs on tangible screen-based UIs. This paper presents an algorithm for automated Dynamic AOI Mapping (aDAM), which allows the automated mapping of gaze data recorded with mobile eye tracking to predefined AOIs on tangible screen-based UIs. The algorithm is evaluated on two medical devices, which represent two extreme examples of tangible screen-based UIs. The individual elements of aDAM are examined for accuracy and robustness, as well as for the time saved compared to manual mapping. The break-even point at which an analyst's effort for aDAM matches that of manual analysis is found to be 8.9 min of gaze data. The accuracy and robustness of both the automated gaze mapping and the screen matching indicate that aDAM can be applied to a wide range of products. aDAM allows, for the first time, automated AOI analysis of tangible screen-based UIs with AOIs that change dynamically over time. The algorithm requires some additional initial input for setup and training, but thereafter the analysis effort is determined only by computation time, independent of the duration of the gaze data, and requires no further manual work. The efficiency of the approach has the potential to enable broader adoption of mobile eye tracking in usability testing for the development of new products and may contribute to a more data-driven usability engineering process in the future.
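The core mapping step described in the abstract — projecting gaze points from the mobile eye tracker's scene camera onto the device screen and assigning them to predefined AOIs — can be illustrated with a minimal sketch. The AOI names, coordinates, and function names below are hypothetical and are not taken from the paper; the homography is assumed to come from an upstream screen-matching step such as feature-based registration.

```python
import numpy as np

# Hypothetical AOI layout: name -> (x_min, y_min, x_max, y_max) in screen pixels.
AOIS = {
    "start_button": (0, 400, 200, 480),
    "dose_display": (0, 0, 320, 120),
}

def map_gaze_to_screen(gaze_xy, H):
    """Project a gaze point from scene-camera coordinates to screen
    coordinates using a 3x3 homography H (assumed to be estimated
    upstream, e.g. by feature matching against a screen reference)."""
    p = np.array([gaze_xy[0], gaze_xy[1], 1.0])
    q = H @ p
    return q[:2] / q[2]  # dehomogenize

def assign_aoi(screen_xy, aois=AOIS):
    """Return the name of the AOI containing the point, or None
    if the gaze falls outside all defined AOIs."""
    x, y = screen_xy
    for name, (x0, y0, x1, y1) in aois.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

# With an identity homography the scene and screen frames coincide:
print(assign_aoi(map_gaze_to_screen((100, 440), np.eye(3))))  # start_button
print(assign_aoi(map_gaze_to_screen((500, 500), np.eye(3))))  # None
```

Because the AOIs in the paper change dynamically, a full implementation would additionally look up the AOI layout valid for the current screen state (identified by the screen-matching component) before calling the assignment step.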

Updated: 2020-09-11