
People Do not Automatically Take the Level-1 Visual Perspective of Humanoid Robot Avatars

Published in: International Journal of Social Robotics

Abstract

Taking the perspective of others is critical for both human–human and human–robot interaction. Previous studies using the dot perspective task have shown that people automatically process what other people can see. In this study, using the classical dot perspective task, we found that Chinese participants did not automatically process humanoid robot avatars’ perspectives, whether they judged only from their own perspective (Experiment 1) or judged randomly between their own and the avatar’s perspectives (Experiment 2). Moreover, participants’ anthropomorphism tendency was related to the efficiency, but not the automaticity, of perspective-taking. These results reveal that human–human and human–robot interactions may differ even in basic visual processing, and suggest people’s anthropomorphism tendency as an influential factor in human–robot interaction.
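The abstract rests on the logic of the dot perspective task: a trial is "consistent" when the participant and the avatar see the same number of dots and "inconsistent" otherwise, and automatic processing of the avatar's view shows up as slower self-perspective judgments on inconsistent trials (the altercentric intrusion effect). The following is a minimal illustrative sketch of that trial classification and effect computation — not the authors' analysis code; the trial fields, field names, and RT values are hypothetical:

```python
# Illustrative sketch of dot-perspective-task scoring (hypothetical data).

def trial_consistency(self_count: int, avatar_count: int) -> str:
    """A trial is 'consistent' when participant and avatar see the same
    number of dots, 'inconsistent' otherwise."""
    return "consistent" if self_count == avatar_count else "inconsistent"

def altercentric_intrusion(trials: list) -> float:
    """Mean RT(inconsistent) minus mean RT(consistent) on self-perspective
    judgments; a reliably positive difference is taken as evidence that
    the avatar's perspective was computed automatically."""
    groups = {"consistent": [], "inconsistent": []}
    for t in trials:
        if t["judged_perspective"] != "self":
            continue  # intrusion is measured on self-perspective trials only
        kind = trial_consistency(t["self_count"], t["avatar_count"])
        groups[kind].append(t["rt_ms"])
    mean = lambda xs: sum(xs) / len(xs)
    return mean(groups["inconsistent"]) - mean(groups["consistent"])

# Hypothetical trials: two consistent, two inconsistent.
trials = [
    {"judged_perspective": "self", "self_count": 2, "avatar_count": 2, "rt_ms": 510},
    {"judged_perspective": "self", "self_count": 2, "avatar_count": 1, "rt_ms": 560},
    {"judged_perspective": "self", "self_count": 1, "avatar_count": 1, "rt_ms": 490},
    {"judged_perspective": "self", "self_count": 3, "avatar_count": 1, "rt_ms": 580},
]
print(altercentric_intrusion(trials))  # 70.0
```

Under this scoring, the paper's central finding corresponds to the intrusion effect being absent (near zero) when the avatar is a humanoid robot rather than a human.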


Data Availability

Data and materials are available upon request from the first author.

Code Availability

Not applicable.


Funding

This study was funded by the Major Projects of Philosophy and Social Science Research in Jiangsu Universities (Grant Number 2018SJZDA020), and the Fourth Pilot-research Program for Human Spaceflight of China (Grant Number 030602).

Author information

Correspondence to Chengli Xiao or Renlai Zhou.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.


Cite this article

Xiao, C., Fan, Y., Zhang, J. et al. People Do not Automatically Take the Level-1 Visual Perspective of Humanoid Robot Avatars. Int J of Soc Robotics 14, 165–176 (2022). https://doi.org/10.1007/s12369-021-00773-x
