Abstract
Using touchscreens while driving creates competition for visual attention that increases crash risk. To address this problem, we developed an auditory-supported air gesture system. We conducted two driving-simulator experiments to investigate the system's influence on driving performance, eye glance behavior, secondary task performance, and driver workload. Experiment 1 (23 participants) examined the impact of menu layout and auditory displays. Experiment 2 (24 participants) compared the best-performing systems from Experiment 1 with equivalent touchscreen systems. Results from Experiment 1 showed that menus arranged in 2 × 2 grids outperformed 4 × 4 grids across all measures, and that auditory displays can reduce the visual demands of in-vehicle controls. In Experiment 2, auditory-supported air gestures allowed drivers to keep their eyes on the road more, produced driver workload and driving performance equivalent to touchscreens, but slightly decreased secondary task performance. Implications are discussed in terms of multiple resources theory and Fitts's law.
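The 2 × 2 versus 4 × 4 result can be read through Fitts's law, which predicts longer movement times for smaller, denser targets. A minimal sketch of that prediction follows; the grid dimensions, travel distance, and the Shannon formulation used here are illustrative assumptions, not values taken from the experiments:

```python
import math

def index_of_difficulty(distance, width):
    """Fitts's index of difficulty (Shannon formulation), in bits.

    A higher ID predicts a longer movement time via MT = a + b * ID,
    where a and b are empirically fitted constants.
    """
    return math.log2(distance / width + 1)

# Hypothetical 10-unit-wide menu area: target width shrinks as the
# grid gets denser, while the travel distance stays comparable.
id_2x2 = index_of_difficulty(distance=6.0, width=10 / 2)  # large targets
id_4x4 = index_of_difficulty(distance=6.0, width=10 / 4)  # small targets

# The denser grid yields a higher index of difficulty, consistent
# with the slower, more demanding selections observed for 4 x 4 menus.
print(f"ID 2x2: {id_2x2:.2f} bits, ID 4x4: {id_4x4:.2f} bits")
```

Under these assumed dimensions the 4 × 4 targets carry roughly half a bit more difficulty per selection, which compounds over repeated menu interactions while driving.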
Sterkenburg, J., Landry, S. & Jeon, M. Design and evaluation of auditory-supported air gesture controls in vehicles. J Multimodal User Interfaces 13, 55–70 (2019). https://doi.org/10.1007/s12193-019-00298-8