
Design and evaluation of auditory-supported air gesture controls in vehicles

  • Original Paper
  • Journal on Multimodal User Interfaces

Abstract

Using touchscreens while driving creates competition for visual attention that increases crash risk. To address this issue, we developed an auditory-supported air gesture system and conducted two experiments in a driving simulator to investigate its influence on driving performance, eye glance behavior, secondary task performance, and driver workload. In Experiment 1 we investigated the impact of menu layout and auditory displays with 23 participants. In Experiment 2 we compared the best-performing systems from Experiment 1 against equivalent touchscreen systems with 24 participants. Experiment 1 showed that menus arranged in 2 × 2 grids outperformed 4 × 4 grids across all measures, and demonstrated that auditory displays can reduce the visual demands of in-vehicle controls. In Experiment 2, auditory-supported air gestures allowed drivers to keep their eyes on the road more, produced equivalent driver workload and driving performance, and slightly decreased secondary task performance compared to touchscreens. Implications are discussed in terms of multiple resources theory and Fitts's law.
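The grid-layout result is consistent with Fitts's law: halving the grid density doubles each target's effective width, lowering the index of difficulty of every selection. The sketch below illustrates this with the Shannon formulation of the index of difficulty; the display width and reach distance are illustrative assumptions, not values from the paper.

```python
import math

def fitts_id(distance: float, width: float) -> float:
    """Shannon formulation of Fitts's index of difficulty, in bits."""
    return math.log2(distance / width + 1)

# Hypothetical menu geometry (assumed for illustration): a 20 cm-wide
# display divided into grid cells, with a fixed 15 cm hand travel.
reach = 15.0              # cm, assumed reach distance to a target
width_2x2 = 20.0 / 2      # 10 cm-wide target in a 2 x 2 grid
width_4x4 = 20.0 / 4      # 5 cm-wide target in a 4 x 4 grid

id_2x2 = fitts_id(reach, width_2x2)   # log2(2.5) ~ 1.32 bits
id_4x4 = fitts_id(reach, width_4x4)   # log2(4.0) = 2.00 bits
print(f"2x2 ID: {id_2x2:.2f} bits, 4x4 ID: {id_4x4:.2f} bits")
```

Under these assumed dimensions, each selection in the 4 × 4 layout carries roughly half a bit more difficulty, which predicts longer movement times and, in a dual-task setting, more time with attention off the road.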



References

  1. Horrey W, Wickens C (2007) In-vehicle glance duration: distributions, tails, and model of crash risk. Transp Res Rec J Transp Res Board 2018:22–28

  2. Klauer SG et al (2006) The impact of driver inattention on near-crash/crash risk: an analysis using the 100-car naturalistic driving study data (FHWA-HRT-04-138). National Highway Traffic Safety Administration, Washington, DC. http://www.nhtsa.gov/DOT/NHTSA/NRD/Multimedia/PDFs/Crash%20Avoidance/2006/DriverInattention.pdf

  3. Olson RL, Hanowski RJ, Hickman JS, Bocanegra J (2009) Driver distraction in commercial vehicle operations (FMCSA-RRT-09-042). Federal Motor Carrier Safety Administration, United States

  4. Green P (2000) Crashes induced by driver information systems and what can be done to reduce them. In: SAE conference proceedings, 1999

  5. Burnett GE, Summerskill SJ, Porter JM (2004) On-the-move destination entry for vehicle navigation systems: unsafe by any means? Behav Inf Technol 23(4):265–272

  6. Sodnik J et al (2008) A user study of auditory versus visual interfaces for use while driving. Int J Hum Comput Stud 66(5):318–332

  7. Riener A (2012) Gestural interaction in vehicular applications. Computer 45(4):42–47

  8. May KR, Gable TM, Walker BN (2014) A multimodal air gesture interface for in-vehicle menu navigation. In: Adjunct proceedings of the 6th international conference on automotive user interfaces and interactive vehicular applications. ACM

  9. Gable TM et al (2015) Exploring and evaluating the capabilities of Kinect v2 in a driving simulator environment. In: Proceedings of the 7th international conference on automotive user interfaces and interactive vehicular applications. ACM

  10. Wickens C (2002) Multiple resources and performance prediction. Theor Issues Ergon Sci 3(2):159–177

  11. Gaver WW (1989) The SonicFinder: an interface that uses auditory icons. Hum Comput Interact 4(1):67–94

  12. Edwards AD (1989) Soundtrack: an auditory interface for blind users. Hum Comput Interact 4(1):45–66

  13. Jeon M, Walker BN (2009) "Spindex": accelerated initial speech sounds improve navigation performance in auditory menus. In: Proceedings of the human factors and ergonomics society annual meeting, vol 53, no 17. SAGE Publications

  14. Jeon M, Walker BN (2011) Spindex (speech index) improves auditory menu acceptance and navigation performance. ACM Trans Access Comput (TACCESS) 3(3):10

  15. Fitts P (1954) The information capacity of the human motor system in controlling the amplitude of movement. J Exp Psychol 47(6):381

  16. Fitts P, Peterson J (1964) Information capacity of discrete motor responses. J Exp Psychol 67(2):103

  17. MacKenzie IS (1992) Fitts' law as a research and design tool in human–computer interaction. Hum Comput Interact 7(1):91–139

  18. Akyol S, Canzler U, Bengler K, Hahn W (2000) Gesture control for use in automobiles. In: IAPR MVA workshop, pp 349–352

  19. Ohn-Bar E, Tran C, Trivedi M (2012) Hand gesture-based visual user interface for infotainment. In: AutomotiveUI '12, pp 111–115

  20. Cairnie N, Ricketts IW, McKenna SJ, McAllister G (2000) Using finger-pointing to operate secondary controls in automobiles. In: Intelligent vehicles symposium, pp 550–555

  21. Rahman ASM, Saboune J, El Saddik A (2011) Motion-path based in-car gesture control of the multimedia devices. In: MPH, pp 69–75

  22. Alpern M, Minardo K (2003) Developing a car gesture interface for use as a secondary task. In: CHI '03, p 932

  23. Wu S, Gable T, May K, Choi YM, Walker BN (2016) Comparison of surface gestures and air gestures for in-vehicle menu navigation. Arch Des Res 29(4):65–80

  24. May K, Gable TM, Wu X, Sardesai RR, Walker BN (2016) Choosing the right air gesture: impacts of menu length and air gesture type on driver workload. In: Adjunct proceedings of the 8th international conference on automotive user interfaces and interactive vehicular applications, pp 69–74

  25. National Highway Traffic Safety Administration, Department of Transportation (2012) Visual-manual NHTSA driver distraction guidelines for in-vehicle electronic devices

  26. Sterkenburg J, Landry S, Jeon M, Johnson J (2016) Towards an in-vehicle sonically-enhanced gesture control interface: a pilot study. In: Proceedings of the international conference on auditory display. https://doi.org/10.21785/icad2016.015

  27. Sterkenburg J, Landry S, Jeon M (2017) Influences of visual and auditory displays on aimed movements using air gesture controls. In: Proceedings of the international conference on auditory display, pp 81–85. https://doi.org/10.21785/icad2017.065

  28. Hart S, Staveland L (1988) Development of NASA-TLX (Task Load Index): results of empirical and theoretical research. Adv Psychol 52:139–183

  29. Alm H, Nilsson L (1995) The effects of a mobile telephone task on driver behaviour in a car following situation. Accid Anal Prev 27(5):707–715

  30. Strayer DL, Drews FA (2004) Profiles in driver distraction: effects of cell phone conversations on younger and older drivers. Hum Factors 46(4):640–649

  31. Hatfield BC, Wyatt WR, Shea JB (2010) Effects of auditory feedback on movement time in a Fitts task. J Mot Behav 42(5):289–293

  32. Zhao S, Dragicevic P, Chignell M, Balakrishnan R, Baudisch P (2007) Earpod: eyes-free menu selection using touch input and reactive audio feedback. In: CHI proceedings of the SIGCHI conference on human factors in computing systems, pp 1395–1404

  33. Drews FA, Yazdani H, Godfrey CN, Cooper JM, Strayer DL (2009) Text messaging during simulated driving. Hum Factors 51:762–770

  34. Parasuraman R, Sheridan TB, Wickens CD (2008) Situation awareness, mental workload, and trust in automation: viable, empirically supported cognitive engineering constructs. J Cogn Eng Decis Mak 2(2):140–160


Author information


Corresponding author

Correspondence to Myounghoon Jeon.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Sterkenburg, J., Landry, S. & Jeon, M. Design and evaluation of auditory-supported air gesture controls in vehicles. J Multimodal User Interfaces 13, 55–70 (2019). https://doi.org/10.1007/s12193-019-00298-8


Keywords

Navigation