Detecting Intention Through Motor-Imagery-Triggered Pupil Dilations
Human-Computer Interaction (IF 5.3). Pub Date: 2017-08-17. DOI: 10.1080/07370024.2017.1293540
David Rozado, Martin Lochner, Ulrich Engelke, Andreas Dünser

Human–computer interaction systems that bypass manual control can be beneficial for many use cases, including users with severe motor disability. We investigated pupillometry (inferring mental activity via dilations of the pupil) as an interaction method because it is noninvasive, easy to analyse, and increasingly available for practical development. In 3 experiments we investigated the efficacy of using pupillometry to detect imaginary motor movements of the hand. In Experiment 1 we demonstrated that, on average, the pupillary response is greater when the participant is imagining a hand-grasping motion, as compared with the control condition. In Experiment 2 we investigated how imaginary hand-grasping affects the pupillary response over time. In Experiment 3 we employed a simple classifier to demonstrate single-trial detection of imagined motor events using pupillometry. Using the mean pupil diameter of a single trial, accuracy rates as high as 71.25% were achieved. Implications for the development of a pupillometry-based switch and future directions are discussed.
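The single-trial detection described in Experiment 3 can be sketched as a simple threshold classifier on the mean pupil diameter of each trial. The sketch below uses synthetic data and a midpoint-between-class-means threshold; the trial counts, diameter values, and decision rule are illustrative assumptions, not the authors' implementation or results.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic single-trial pupil traces (arbitrary units): imagined hand-grasping
# trials are assumed to show a slightly larger mean dilation than rest trials.
# All numbers here are hypothetical, chosen only to make the sketch runnable.
rest = rng.normal(loc=3.0, scale=0.2, size=(40, 100))     # 40 trials x 100 samples
imagery = rng.normal(loc=3.3, scale=0.2, size=(40, 100))

def classify(trial, threshold):
    """Label a single trial as motor imagery if its mean pupil diameter exceeds the threshold."""
    return trial.mean() > threshold

# Fit the threshold on a training split: midway between the two class means.
threshold = (rest[:20].mean() + imagery[:20].mean()) / 2

# Evaluate single-trial classification on the held-out trials.
test_trials = np.concatenate([rest[20:], imagery[20:]])
labels = np.array([False] * 20 + [True] * 20)
predictions = np.array([classify(t, threshold) for t in test_trials])
accuracy = (predictions == labels).mean()
print(f"single-trial accuracy: {accuracy:.2%}")
```

Averaging over a whole trial discards the temporal shape of the dilation (the subject of Experiment 2), which is why a single scalar feature like this gives only a modest, switch-like binary decision.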




Updated: 2017-08-17