Abstract
This paper proposes a novel platform for remote collaboration that enables a remote expert to assist a local worker with industrial assembly tasks from a different location. Our goal is to compare the effect of sharing the expert’s head pointer with that of AR annotations in SAR-based remote collaboration for manufacturing. First, we developed an AR remote collaborative platform that shares a remote expert’s head pointer instead of eye gaze. We then evaluated the prototype in a user study comparing two conditions, AR annotations and gaze cues (GC), with respect to assembly efficiency, number of incorrect operations, workload, and collaborative experience. The results show that sharing the head pointer improves performance, co-presence awareness, and the user collaborative experience while reducing the number of incorrect operations. More importantly, we implemented GC visualization using low-cost head tracking and found that it serves as an effective referential pointer. The head pointer can therefore be a competent representation of GC in AR/MR remote collaboration for assembly assistance. Our research has practical significance for industrial applications of GC-based AR/MR remote collaboration.
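The abstract describes sharing a head pointer as a low-cost stand-in for eye gaze. The paper does not give implementation details here, but the core of such a cue can be sketched as casting a ray from the tracked head pose and intersecting it with the shared work surface; the function name and plane parameterization below are illustrative assumptions, not the authors' actual implementation.

```python
import numpy as np

def head_pointer_on_plane(head_pos, head_dir, plane_point, plane_normal):
    """Cast a ray from the head pose and return where it hits a
    work-surface plane (the shared head-pointer cursor position).
    Returns None if the ray is parallel to the plane or points away."""
    head_dir = head_dir / np.linalg.norm(head_dir)
    plane_normal = plane_normal / np.linalg.norm(plane_normal)
    denom = np.dot(head_dir, plane_normal)
    if abs(denom) < 1e-6:
        return None  # ray parallel to the plane
    t = np.dot(plane_point - head_pos, plane_normal) / denom
    if t < 0:
        return None  # plane is behind the viewer
    return head_pos + t * head_dir

# Example: head 1.5 m above a horizontal table, looking 45 degrees down
cursor = head_pointer_on_plane(
    head_pos=np.array([0.0, 1.5, 0.0]),
    head_dir=np.array([0.0, -1.0, 1.0]),
    plane_point=np.array([0.0, 0.0, 0.0]),
    plane_normal=np.array([0.0, 1.0, 0.0]),
)
# cursor is [0, 0, 1.5]: the pointer lands 1.5 m in front of the viewer
```

In a SAR setup, the resulting point would then be rendered by the projector (or HMD) so the local worker sees where the remote expert is looking.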
Acknowledgements
We would like to thank all the participants in this experiment; they carried out the experiment and filled in the questionnaires carefully, which greatly helped our research.
Availability of data and materials
All data generated or analyzed during this study are included in this published article.
Funding
This research was financially sponsored by the National Key R&D Program of China (2019YFB1703800) and 111 project (B13044).
Author information
Contributions
Zenglei Wang wrote the draft, and Shusheng Zhang and Xiaoliang Bai revised the paper. Xiaoliang Bai also designed the MR prototype system.
Ethics declarations
Ethics approval
Not applicable.
Consent to participate
All participants consented to participate in the experiment.
Consent for publication
All participants consented to the publication of the related data, and all authors consent to publish the research.
Competing interests
The authors declare no competing interests.
Additional information
Publisher’s note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
About this article
Cite this article
Wang, Z., Zhang, S. & Bai, X. A mixed reality platform for assembly assistance based on gaze interaction in industry. Int J Adv Manuf Technol 116, 3193–3205 (2021). https://doi.org/10.1007/s00170-021-07624-z