
Sharing gaze rays for visual target identification tasks in collaborative augmented reality

  • Original Paper
  • Journal on Multimodal User Interfaces

Abstract

Augmented reality (AR) technologies provide a shared platform for users to collaborate in a physical context involving both real and virtual content. To enhance the quality of interaction between AR users, researchers have proposed augmenting users’ interpersonal space with embodied cues such as their gaze direction. While beneficial for interpersonal spatial communication, such shared gaze environments suffer from multiple types of error related to eye tracking and networking that can reduce objective performance and degrade subjective experience. In this paper, we present a human-subjects study on the impact of accuracy, precision, latency, and dropout errors on users’ performance when using shared gaze cues to identify a target among a crowd of people. We simulated varying levels of error and target distance, and measured participants’ objective performance through their response time and error rate, and their subjective experience and cognitive load through questionnaires. We found significant differences suggesting that the simulated error levels had stronger effects on participants’ performance than target distance, with accuracy and latency having a high impact on participants’ error rate. We also observed that participants assessed their own performance as lower than it objectively was. We discuss implications for practical shared gaze applications and present a multi-user prototype system.
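The article itself does not include implementation code on this page; as a minimal, hypothetical sketch of how the four error types named above could be injected into a shared gaze ray for simulation purposes, the snippet below models accuracy error as a constant angular offset, precision error as zero-mean Gaussian per-frame jitter, latency as a short delay buffer, and dropout as occasional held frames. The class name, parameter names, and default values are assumptions for illustration, not taken from the paper.

    # Illustrative sketch only (not the authors' implementation): one way to inject
    # accuracy, precision, latency, and dropout errors into a shared gaze ray.
    # All class/parameter names and default values here are hypothetical.
    import math
    import random
    from collections import deque

    def normalize(v):
        n = math.sqrt(sum(c * c for c in v)) or 1.0
        return tuple(c / n for c in v)

    def rotate_about_axis(v, axis, angle_rad):
        """Rotate vector v about a unit axis by angle_rad (Rodrigues' formula)."""
        c, s = math.cos(angle_rad), math.sin(angle_rad)
        dot = sum(a * b for a, b in zip(axis, v))
        cross = (axis[1] * v[2] - axis[2] * v[1],
                 axis[2] * v[0] - axis[0] * v[2],
                 axis[0] * v[1] - axis[1] * v[0])
        return tuple(vi * c + cri * s + ai * dot * (1.0 - c)
                     for vi, cri, ai in zip(v, cross, axis))

    class GazeErrorSimulator:
        def __init__(self, accuracy_deg=1.0, precision_deg=0.5,
                     latency_frames=6, dropout_prob=0.05):
            self.accuracy_deg = accuracy_deg    # systematic angular offset (accuracy)
            self.precision_deg = precision_deg  # std. dev. of per-frame jitter (precision)
            self.dropout_prob = dropout_prob    # probability of losing a frame (dropout)
            self.buffer = deque(maxlen=latency_frames + 1)  # delay line (latency)
            self.last_output = None

        def step(self, true_gaze_dir, offset_axis=(0.0, 1.0, 0.0)):
            """Degrade one frame of the true gaze direction (a unit vector)."""
            # Accuracy: constant offset about a fixed axis.
            d = rotate_about_axis(true_gaze_dir, offset_axis,
                                  math.radians(self.accuracy_deg))
            # Precision: zero-mean Gaussian jitter about a random axis each frame.
            jitter_axis = normalize((random.gauss(0, 1),
                                     random.gauss(0, 1),
                                     random.gauss(0, 1)))
            d = rotate_about_axis(d, jitter_axis,
                                  math.radians(random.gauss(0.0, self.precision_deg)))
            # Latency: output the sample that entered the buffer latency_frames ago.
            self.buffer.append(d)
            delayed = self.buffer[0]
            # Dropout: occasionally hold the previous output instead of updating.
            if self.last_output is not None and random.random() < self.dropout_prob:
                return self.last_output
            self.last_output = delayed
            return delayed

    # Example: feed a fixed forward-looking gaze ray through the simulator.
    sim = GazeErrorSimulator()
    for _ in range(10):
        print(sim.step((0.0, 0.0, 1.0)))

In such a setup, the perturbed direction would be applied to the shared gaze ray rendered for the collaborating user, while the local user continues to see their own unperturbed view.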





Funding

This material includes work supported in part by the Office of Naval Research under Award Number N00014-17-1-2927 (Dr. Peter Squire, Code 34); the National Science Foundation under Collaborative Award Number 1800961 (Dr. Ephraim P. Glinert, IIS); and the AdventHealth Endowed Chair in Healthcare Simulation (Prof. Welch). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the supporting institutions.

Author information

Corresponding author

Correspondence to Austin Erickson.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article


Cite this article

Erickson, A., Norouzi, N., Kim, K. et al. Sharing gaze rays for visual target identification tasks in collaborative augmented reality. J Multimodal User Interfaces 14, 353–371 (2020). https://doi.org/10.1007/s12193-020-00330-2

