Multimodal Coordination Measures to Understand Users and Tasks
ACM Transactions on Computer-Human Interaction ( IF 4.8 ) Pub Date : 2020-11-08 , DOI: 10.1145/3412365
Siyuan Chen, Julien Epps

Physiological and behavioral measures allow computing devices to augment the user interaction experience by understanding users' mental load. Current techniques typically exploit complementary information between different modalities to index load level within a single task type. In this study, we propose a new approach that uses the timing between physiology/behavior change events to index low and high load levels across four task types. Findings from a user study, in which eye, speech, and head movement data were collected from 24 participants, demonstrate that the proposed measures differ significantly between low and high load levels, with large effect sizes. We also found that voluntary actions are more likely to be coordinated during tasks. Implications for the design of multimodal-multisensor interfaces include that (i) exploiting change events and their interaction across multiple modalities is a feasible way to distinguish task load levels and load types, and (ii) voluntary actions should be allowed for effective task completion.
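The core idea of timing-based coordination can be illustrated with a minimal sketch. This is not the authors' actual analysis pipeline; the threshold-based change detector and the nearest-event offset measure below are illustrative assumptions, standing in for whatever event-detection and coordination measures the paper actually uses.

```python
import numpy as np

def change_events(signal, threshold):
    """Return sample indices where the absolute first difference
    exceeds a threshold -- a simple proxy for 'change events'
    (illustrative; the paper's detectors are modality-specific)."""
    diffs = np.abs(np.diff(signal))
    return np.where(diffs > threshold)[0] + 1

def event_timing_offsets(events_a, events_b, rate_hz):
    """For each change event in modality A, the signed time (seconds)
    to the nearest change event in modality B. Small, consistent
    offsets indicate temporal coordination between the modalities."""
    offsets = []
    for t in events_a:
        nearest = events_b[np.argmin(np.abs(events_b - t))]
        offsets.append((nearest - t) / rate_hz)
    return np.array(offsets)

# Toy example: two 10 Hz signals with near-synchronous step changes,
# e.g. a pupil-size channel and a head-movement channel.
rate = 10.0
a = np.concatenate([np.zeros(20), np.ones(20)])  # change at sample 20
b = np.concatenate([np.zeros(22), np.ones(18)])  # change at sample 22
ev_a = change_events(a, 0.5)
ev_b = change_events(b, 0.5)
print(event_timing_offsets(ev_a, ev_b, rate))    # -> [0.2]
```

Distributions of such offsets, compared between conditions, are one simple way that timing between modalities could index low versus high load.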

Updated: 2020-11-08