Walking-in-place for omnidirectional VR locomotion using a single RGB camera

  • Original Article
  • Published:
Virtual Reality

Abstract

Locomotion is a fundamental interaction element that allows navigation inside the virtual environment, and walking-in-place (WIP) techniques have been actively developed as a balanced compromise between naturalness and efficiency. A popular way to implement WIP has been the Kinect, which is low-cost, easy to set up, and markerless; however, because the Kinect tracks poorly when it does not face the user's front, such systems either required multiple sensors to be integrated or supported only a limited range of walking directions. This study proposes a WIP technique for omnidirectional VR locomotion based on a single RGB camera, utilizing OpenPose, an open-source 2D human pose estimation system. Three WIP techniques (an existing Kinect-based technique, a proposed Kinect-based technique, and the proposed OpenPose-based technique) were compared in terms of the variation of virtual walking speed and subjective evaluation, through a user study with walking tasks in different directions. Experimental results showed that the proposed OpenPose-based technique performed comparably to the others when the user faced the camera, and it yielded lower variation of virtual walking speed and higher subjective ratings at non-forward directions. The proposed OpenPose-based WIP technique can be used in VR applications to provide a fully unobstructed locomotion experience: it achieves stable, omnidirectional WIP-based VR locomotion with a single low-cost, easily accessible RGB camera, without additional sensors, while leaving both hands free for other interactions.
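
A minimal illustrative sketch of the general idea (not the authors' implementation, whose gesture-detection and speed-mapping rules are given in the full text): it counts walking-in-place steps from the vertical oscillation of the two ankle keypoints that a 2D pose estimator such as OpenPose reports per frame (BODY_25 layout, one (x, y, confidence) triple per joint) and maps step cadence to a virtual walking speed. The WipSpeedEstimator class, the lift threshold, the assumed stride length, and the smoothing constant are hypothetical values chosen for illustration.

```python
# Illustrative sketch only: derive a virtual walking speed from per-frame 2D
# keypoints in OpenPose's BODY_25 order; image y grows downward.
from collections import deque

R_ANKLE, L_ANKLE = 11, 14   # BODY_25 indices of the right and left ankles
CONF_MIN = 0.3              # ignore joints detected with low confidence


class WipSpeedEstimator:
    """Counts in-place steps from ankle lift and maps cadence to walking speed."""

    def __init__(self, fps=30.0, lift_px=15.0, stride_m=0.7, window_s=1.5):
        self.fps = fps
        self.lift_px = lift_px       # vertical ankle lift (pixels) that counts as a step
        self.stride_m = stride_m     # assumed virtual distance covered per step (m)
        self.steps = deque(maxlen=int(window_s * fps))   # recent per-frame step flags
        self.baseline = {R_ANKLE: None, L_ANKLE: None}   # resting ankle height (image y)
        self.lifted = {R_ANKLE: False, L_ANKLE: False}

    def update(self, keypoints):
        """keypoints: sequence of 25 (x, y, confidence) triples for one camera frame.

        Returns the current virtual walking speed in m/s.
        """
        step = 0
        for joint in (R_ANKLE, L_ANKLE):
            _, y, conf = keypoints[joint]
            if conf < CONF_MIN:
                continue                       # joint occluded or not detected
            if self.baseline[joint] is None:
                self.baseline[joint] = y
            # Slow exponential average of the ankle height; brief lifts barely move it.
            self.baseline[joint] = 0.98 * self.baseline[joint] + 0.02 * y
            lift = self.baseline[joint] - y    # positive when the ankle rises
            if lift > self.lift_px and not self.lifted[joint]:
                self.lifted[joint] = True      # rising edge: the foot just left the ground
                step = 1
            elif lift < 0.5 * self.lift_px:
                self.lifted[joint] = False     # foot is back down: re-arm the detector
        self.steps.append(step)
        cadence = sum(self.steps) * self.fps / max(len(self.steps), 1)  # steps per second
        return cadence * self.stride_m
```

In an actual application, the keypoints would be read from OpenPose's per-frame output, the returned speed would drive viewpoint translation in the VR engine, and the travel direction would come from a separate cue such as head or torso orientation, which this sketch does not model.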

Data availability

The datasets and code generated and/or analyzed during this study are not publicly available due to privacy and intellectual property concerns, but they are available from the corresponding author on reasonable request.

Acknowledgements

This work was supported by the Basic Science Research Program through the National Research Foundation of Korea funded by the Ministry of Science, ICT and Future Planning (NRF-2020R1F1A1048510).

Author information

Corresponding author

Correspondence to Shuping Xiong.

Ethics declarations

Conflict of interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Ethical approval

The experimental protocol was approved by the University Institutional Review Board (KH2020-069), and all participants gave consent to participate and to publish.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Kim, W., Sung, J. & Xiong, S. Walking-in-place for omnidirectional VR locomotion using a single RGB camera. Virtual Reality 26, 173–186 (2022). https://doi.org/10.1007/s10055-021-00551-0

Keywords

Navigation