
A mixed reality platform for assembly assistance based on gaze interaction in industry

  • ORIGINAL ARTICLE

The International Journal of Advanced Manufacturing Technology

Abstract

A novel platform for remote collaboration is proposed in this paper. It enables a remote expert to assist a worker with industrial assembly tasks from a different location. Our goal is to compare the effect of sharing a head pointer with that of AR annotations in SAR remote collaboration for manufacturing. First, we develop an AR remote collaborative platform that shares the remote expert’s head pointer instead of eye gaze. Then, we evaluate the prototype system in a user study comparing two conditions, AR annotations and gaze cues (GC), with respect to assembly efficiency, the number of incorrect operations, workload, and collaborative experience. The results show that sharing the head pointer improves performance, co-presence awareness, and the collaborative user experience, and reduces the number of incorrect operations. More importantly, we implement GC visualization with low-cost head tracking and find that it serves well as a referential pointer. Therefore, a head pointer could be a competent representation of GC in AR/MR remote collaboration for assembly assistance. Our research has great practical significance for industrial applications of GC-based AR/MR remote collaboration.
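The article itself includes no code, but the core mechanism it describes, treating the remote expert's head pointer as a low-cost proxy for eye gaze by casting the head-forward ray into the shared task space and rendering the hit point as a cursor, can be sketched briefly. The following Python sketch is illustrative only and is not the authors' implementation; the function name, the pose convention (third rotation column taken as the viewing axis), and the planar task-space model are all assumptions made here for the example.

```python
import numpy as np

def head_pointer_on_plane(head_pos, head_rot, plane_point, plane_normal):
    """Cast the expert's head-forward ray onto the shared task plane.

    head_pos:     (3,) head position in world coordinates.
    head_rot:     (3, 3) head orientation; the third column is taken
                  as the forward (viewing) axis -- an assumed convention.
    plane_point:  (3,) any point on the task-space plane.
    plane_normal: (3,) unit normal of that plane.
    Returns the 3D hit point to render as the shared cursor, or None
    if the ray is parallel to the plane or the plane is behind the viewer.
    """
    forward = head_rot[:, 2]
    denom = float(np.dot(plane_normal, forward))
    if abs(denom) < 1e-6:          # ray (nearly) parallel to the plane
        return None
    t = float(np.dot(plane_normal, plane_point - head_pos)) / denom
    if t < 0:                      # plane lies behind the viewer
        return None
    return head_pos + t * forward

# Example: expert looking straight down the +z axis at a surface 1 m away.
hit = head_pointer_on_plane(
    head_pos=np.zeros(3),
    head_rot=np.eye(3),
    plane_point=np.array([0.0, 0.0, 1.0]),
    plane_normal=np.array([0.0, 0.0, -1.0]),
)
print(hit)  # -> [0. 0. 1.]
```

In an actual MR system the ray would typically be cast against a reconstructed mesh of the workspace rather than a single plane; the plane simply keeps the sketch self-contained.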




Acknowledgements

We would like to thank all the participants in this experiment; they carried out the tasks and completed the questionnaires carefully, which was a great help to our research.

Availability of data and materials

All data generated or analyzed during this study are included in this published article.

Funding

This research was financially sponsored by the National Key R&D Program of China (2019YFB1703800) and the 111 Project (B13044).

Author information

Contributions

Zenglei Wang wrote the draft, and Shusheng Zhang and Xiaoliang Bai revised and improved the paper. In addition, Xiaoliang Bai designed the MR prototype system.

Corresponding authors

Correspondence to Shusheng Zhang or Xiaoliang Bai.

Ethics declarations

Ethics approval

Not applicable.

Consent to participate

All participants consented to participate in the experiment.

Consent for publication

All participants consented to the publication of the related data, and all authors consent to the publication of this research.

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Wang, Z., Zhang, S. & Bai, X. A mixed reality platform for assembly assistance based on gaze interaction in industry. Int J Adv Manuf Technol 116, 3193–3205 (2021). https://doi.org/10.1007/s00170-021-07624-z

