Interactive gaze and finger controlled HUD for cars
Journal on Multimodal User Interfaces ( IF 2.9 ) Pub Date : 2019-11-23 , DOI: 10.1007/s12193-019-00316-9
Gowdham Prabhakar , Aparna Ramakrishnan , Modiksha Madan , L. R. D. Murthy , Vinay Krishna Sharma , Sachin Deshmukh , Pradipta Biswas

Modern automotive infotainment systems aid driving but introduce secondary tasks alongside the primary task of driving. These secondary tasks risk distracting the driver from driving, reducing safety and increasing cognitive workload. This paper presents an intelligent interactive head-up display (HUD) on the windscreen that lets drivers perform secondary tasks such as playing music, operating vent controls, and viewing the navigation map without taking their eyes off the road. The interactive HUD supports pointing and selection, as in traditional graphical user interfaces, but tracks the operator's eye gaze or finger movements instead of mouse or touch input. The system can also estimate the driver's cognitive load and level of distraction. In user studies, the system improved driving performance, measured as mean deviation from the lane in an ISO 26022 lane-change task, compared with a touchscreen system, and participants completed ISO 9241 pointing tasks in under 2 s on average inside a Toyota Etios car.
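To illustrate the kind of gaze-based pointing and selection the abstract describes, the sketch below shows one common approach: smoothing raw gaze samples to damp tracker jitter, then firing a selection when the gaze dwells near a HUD target. This is a minimal illustration, not the authors' implementation; the sample rate, dwell threshold, target radius, and function names are all assumptions chosen for the example.

```python
import math
from collections import deque

# All constants below are hypothetical; the paper only reports that
# selections took under 2 s on average.
SAMPLE_RATE_HZ = 30     # assumed gaze-tracker sampling rate
DWELL_TIME_S = 1.0      # assumed dwell threshold before a selection fires
RADIUS_PX = 40          # assumed target radius on the HUD, in pixels

def smooth(samples, window=5):
    """Moving-average filter over (x, y) gaze samples to damp jitter."""
    buf, out = deque(maxlen=window), []
    for x, y in samples:
        buf.append((x, y))
        out.append((sum(p[0] for p in buf) / len(buf),
                    sum(p[1] for p in buf) / len(buf)))
    return out

def dwell_select(samples, target, radius=RADIUS_PX,
                 dwell_s=DWELL_TIME_S, rate_hz=SAMPLE_RATE_HZ):
    """Return the sample index at which a dwell selection fires, or None.

    A selection fires once the smoothed gaze has stayed within `radius`
    pixels of `target` for `dwell_s` consecutive seconds.
    """
    needed = int(dwell_s * rate_hz)
    run = 0
    for i, (x, y) in enumerate(smooth(samples)):
        if math.hypot(x - target[0], y - target[1]) <= radius:
            run += 1
            if run >= needed:
                return i
        else:
            run = 0  # gaze left the target; restart the dwell timer
    return None
```

The same dwell logic could consume fingertip coordinates instead of gaze samples, which is one way a single selection mechanism could back both input modalities the paper combines.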
