
Hand-adaptive user interface: improved gestural interaction in virtual reality

  • Original Article, Virtual Reality

Abstract

Most interactive user interfaces (UIs) for virtual reality (VR) applications follow the traditional eye-centred design principle, which primarily considers the user's visual search efficiency and comfort while giving relatively little consideration to hand operation performance and ergonomics. As a result, hand interaction in VR is often criticized as inefficient and imprecise. In this paper, we hypothesize that the user's arm movement features, such as the choice of hand and the position at which the hand interacts, influence interaction performance in VR. To verify this, we conducted a free hand target selection experiment with 24 participants. The results showed that (a) hand choice had a significant effect on target selection: with the left hand, targets located in the left-hand space were selected more efficiently and accurately than those in the right-hand space, whereas with the right hand the result was reversed, and (b) free hand interactions at lower positions were more efficient and accurate than those at higher positions. Based on these findings, this paper proposes a hand-adaptive UI technique to improve free hand interaction performance in VR. A comprehensive comparison between the hand-adaptive UI and the traditional eye-centred UI showed that the hand-adaptive UI yielded higher interaction efficiency, lower physical exertion and lower perceived task difficulty than the traditional UI.
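
The adaptation rule at the core of the proposed technique can be illustrated with a short sketch: interactive elements are anchored on the same side as the currently active hand and at a relatively low position in front of the user, rather than at a fixed eye-centred location. The Python code below is a minimal, hypothetical sketch of that rule, not the authors' implementation; the coordinate convention, the Vec3 helper and the numeric offsets are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Vec3:
    x: float  # +x = user's right, -x = user's left (assumed convention)
    y: float  # +y = up
    z: float  # +z = forward, away from the user

def hand_adaptive_anchor(active_hand: str, hand_pos: Vec3) -> Vec3:
    """Return a UI anchor biased toward the active hand and a lower height.

    Sketch of the hand-adaptive idea: targets on the same side as the
    interacting hand, placed low rather than high, were selected faster
    and more accurately in the study. All offsets here are illustrative.
    """
    side = -1.0 if active_hand == "left" else 1.0
    return Vec3(
        x=hand_pos.x + side * 0.15,   # shift toward the active hand's side
        y=min(hand_pos.y, 1.1),       # keep the panel at or below roughly chest height
        z=hand_pos.z + 0.35,          # comfortable reach in front of the hand
    )

# Example: a right-handed selection with the hand raised to shoulder height;
# the panel is nudged rightward and pulled down toward chest height.
anchor = hand_adaptive_anchor("right", Vec3(x=0.25, y=1.4, z=0.3))
print(anchor)
```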


Notes

  1. This e-shopping VR application was supported by the ‘Buy+’ program from the Taobao Corporation (https://www.taobao.com), Alibaba Group.


Acknowledgements

This research was supported by the National Natural Science Foundation of China under Grant 61902097, the Zhejiang Provincial Natural Science Funding under Grant Q19F020010, the Open Research Funding of the State Key Laboratory for Novel Software Technology of Nanjing University under Grant KFKT2019B18, and the Open Research Funding of the State Key Laboratory of Virtual Reality Technology and Systems under Grant VRLAB2020B03.

Author information


Corresponding author

Correspondence to Xiaolong Lou.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Lou, X., Li, X.A., Hansen, P. et al. Hand-adaptive user interface: improved gestural interaction in virtual reality. Virtual Reality 25, 367–382 (2021). https://doi.org/10.1007/s10055-020-00461-7
