
How are Your Robot Friends Doing? A Design Exploration of Graphical Techniques Supporting Awareness of Robot Team Members in Teleoperation

Published in: International Journal of Social Robotics

Abstract

While teleoperated robots continue to proliferate in domains such as search and rescue, field exploration, and the military, human error remains a primary cause of accidents and mistakes. One challenge is that teleoperating a remote robot is cognitively taxing, as the operator needs to understand the robot’s state and monitor all its sensor data. In a multi-robot team, an operator additionally needs to monitor other robots’ progress, states, notifications, errors, and so on to maintain team cohesion. We conducted a design exploration of novel graphical representations of robot team-member state, to support a person controlling one robot in maintaining awareness of other robots in the team. Through a series of evaluations, we examined several design parameters (text, icon, facial expression, use of color, animation, and number of team robots), resulting in a set of guidelines for graphically representing team robot states in remote team teleoperation.
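To make the design space concrete, the sketch below shows one way such team-member state indicators could be parameterized along the dimensions the abstract names (text, icon, facial expression, color, animation). It is a hypothetical illustration, not the interface evaluated in the paper: the state names, the StateIndicator structure, the icon asset names, and the color choices are all assumptions.

```python
from dataclasses import dataclass
from enum import Enum


class RobotState(Enum):
    """Example states of a team robot that an operator might monitor."""
    OK = "ok"
    LOW_BATTERY = "low battery"
    STUCK = "stuck"
    SENSOR_ERROR = "sensor error"


@dataclass
class StateIndicator:
    """One graphical representation of a team robot's state.

    Each field corresponds to a design parameter from the abstract:
    text, icon, facial expression (emoji), color, and animation.
    """
    text: str        # short textual label
    icon: str        # icon asset name (hypothetical)
    emoji: str       # facial-expression glyph
    color: str       # hex color; red reserved here for urgent states
    animated: bool   # whether the indicator animates to attract attention


# Hypothetical mapping from robot state to its on-screen representation.
INDICATORS = {
    RobotState.OK:           StateIndicator("OK", "check.png", "🙂", "#2e7d32", False),
    RobotState.LOW_BATTERY:  StateIndicator("Low battery", "battery.png", "😟", "#f9a825", False),
    RobotState.STUCK:        StateIndicator("Stuck", "warning.png", "😣", "#c62828", True),
    RobotState.SENSOR_ERROR: StateIndicator("Sensor error", "sensor.png", "😵", "#c62828", True),
}

if __name__ == "__main__":
    # Print a text-mode preview of each indicator; "~" marks animated ones.
    for state, ind in INDICATORS.items():
        marker = "~" if ind.animated else " "
        print(f"{marker} {ind.emoji} {ind.text:12s} ({ind.color})")
```

A real teleoperation interface would render these indicators in a peripheral panel of the operator's view; the tabular mapping simply makes explicit that each design parameter can be varied independently, which is what allows the parameters to be compared in isolation.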





Author information


Corresponding author

Correspondence to Stela H. Seo.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Authors retain copyright and grant the International Journal of Social Robotics the right of first publication, with the work simultaneously licensed under a Creative Commons Attribution License that allows others to share the work with an acknowledgement of its authorship and initial publication in this journal.

Appendices

Appendix: State Inquiry Questionnaire (All Text Options): One Team Robot/Text Representation

figure a

Appendix: State Inquiry Questionnaire (All Text Options): Two Team Robot/Text Representation

figure b

Appendix: State Inquiry Questionnaire (Matching Options): One Team Robot/Icon Representation

figure c

Appendix: State Inquiry Questionnaire (Matching Options): Two Team Robot/Icon Representation

figure d

Appendix: State Inquiry Questionnaire (Matching Options): One Team Robot/Emoji Representation

figure e

Appendix: State Inquiry Questionnaire (Matching Options): Two Team Robot/Emoji Representation

figure f

Appendix: State Inquiry Questionnaire (Simpler Version): Left Team Robot/Text Representation

figure g

Appendix: State Inquiry Questionnaire (Simpler Version): Left Team Robot/Icon Representation

figure h

Appendix: State Inquiry Questionnaire (Simpler Version): Left Team Robot/Emoji Representation

figure i


About this article


Cite this article

Seo, S.H., Young, J.E. & Irani, P. How are Your Robot Friends Doing? A Design Exploration of Graphical Techniques Supporting Awareness of Robot Team Members in Teleoperation. Int J of Soc Robotics 13, 725–749 (2021). https://doi.org/10.1007/s12369-020-00670-9

