
How Movements of a Non-Humanoid Robot Affect Emotional Perceptions and Trust

  • Published in: International Journal of Social Robotics

Abstract

The way that we move often carries emotional meaning that the people with whom we interact are adept at detecting. Humans treat the movements of robots similarly, attributing the same emotions when the robots move in ways that are analogous to emotionally charged human movements. However, this HRI work has primarily been done on humanoid or animal-shaped robots. In this paper, we examined whether this effect would hold when people observed the movements of a non-humanoid robot, Cozmo. Moreover, the attribution of an emotional stance to another agent is key in the process of predicting the behavior of the other (Eivers AR et al. in Br J Dev Psychol 28:499–504, 2010). This process is laid bare in transactional scenarios, where the predicted level of trust guides the human’s behavior. The ultimatum game is a transactional framework that we adapted to test in stages how humans predict and react to the behavior of the robot. We performed a study in which people played two rounds of the ultimatum game with a non-humanoid robot that moved in either a positive or negative manner. We found that in both rounds people in the Positive Movement condition rated Cozmo’s emotional valence as higher than those in the Negative Movement condition. In the second round, after Cozmo had responded to the participants’ first offers, Cozmo’s bid response was a significant factor in the Positive Movement condition: participants whose first bids were rejected by Cozmo rated its emotional valence as lower than those whose bids were accepted. There was no effect of movement on trust. We also ran a series of exploratory analyses examining how various factors affected participants’ reasoning about Cozmo’s behavior, and found that unexpected, non-social behaviors (such as moving in a negative manner or rejecting a participant’s offer) led to an increase in anthropomorphic behavior explanations.
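The staged, two-round ultimatum-game procedure described in the abstract can be sketched in code. Note that the token pool size, the specific offers, and the robot's threshold-based acceptance rule below are illustrative assumptions for exposition only; the abstract does not specify the study's actual stakes or Cozmo's decision policy.

```python
# Illustrative sketch of a two-round ultimatum game with a robot responder.
# The pool size and the robot's acceptance rule are hypothetical, not the
# study's actual parameters.

from dataclasses import dataclass

POOL = 10  # hypothetical number of tokens split each round


@dataclass
class RoundResult:
    offer: int          # tokens the human offers to the robot
    accepted: bool      # robot's response to the offer
    human_payoff: int   # proposer keeps the remainder only if accepted
    robot_payoff: int


def play_round(offer: int, robot_accepts) -> RoundResult:
    """One ultimatum round: the human proposes a split, the robot accepts
    or rejects. On rejection, both sides earn nothing (the standard
    ultimatum-game rule)."""
    if not 0 <= offer <= POOL:
        raise ValueError("offer must be between 0 and the pool size")
    if robot_accepts(offer):
        return RoundResult(offer, True, POOL - offer, offer)
    return RoundResult(offer, False, 0, 0)


def two_round_game(offers, robot_accepts):
    """Play the two staged rounds, returning each round's outcome so the
    human's second offer can be analyzed against the robot's first response."""
    return [play_round(o, robot_accepts) for o in offers]


# Hypothetical acceptance rule: reject offers below 3 tokens.
threshold_rule = lambda offer: offer >= 3

results = two_round_game([2, 5], threshold_rule)
```

Under this sketch, a lowball first offer is rejected (both earn nothing) while a fair second offer is accepted, mirroring the accept/reject contrast the study uses as a factor.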


References

  1. Appel M, Weber S, Krause S, Mara M (2016) On the eeriness of service robots with emotional capabilities. In: 2016 11th ACM/IEEE international conference on human–robot interaction (HRI), pp 411–412. IEEE

  2. Aslam S, Standen PJ, Shopland N, Burton A, Brown D (2016) A comparison of humanoid and non-humanoid robots in supporting the learning of pupils with severe intellectual disabilities. In: 2016 International conference on interactive technologies and games (ITAG), pp 7–12. IEEE

  3. Atkinson AP, Dittrich WH, Gemmell AJ, Young AW (2004) Emotion perception from dynamic and static body expressions in point-light and full-light displays. Perception 33(6):717–746

  4. Barrett LF (2017) How emotions are made: the secret life of the brain. Houghton Mifflin Harcourt, Boston


  5. Beck A, Cañamero L, Bard KA (2010) Towards an affect space for robots to display emotional body language. In: 19th International symposium in robot and human interactive communication, pp 464–469. IEEE

  6. Beck A, Cañamero L, Hiolle A, Damiano L, Cosi P, Tesser F, Sommavilla G (2013) Interpretation of emotional body language displayed by a humanoid robot: a case study with children. Int J Soc Robot 5(3):325–334


  7. Belkin LY, Rothman NB (2017) Do I trust you? Depends on what you feel: interpersonal effects of emotions on initial trust at zero-acquaintance. Negot Confl Manag Res 10(1):3–27

  8. Boone RT, Buck R (2003) Emotional expressivity and trustworthiness: the role of nonverbal behavior in the evolution of cooperation. J Nonverbal Behav 27(3):163–182


  9. Boone RT, Cunningham JG (1998) Children’s decoding of emotion in expressive body movement: the development of cue attunement. Dev Psychol 34(5):1007


  10. Breazeal C (2003) Emotion and sociable humanoid robots. Int J Hum Comput Stud 59(1–2):119–155


  11. Bronson GW (1968) The fear of novelty. Psychol Bull 69(5):350

  12. Brooks RA, Breazeal C, Marjanović M, Scassellati B, Williamson MM (1998) The Cog project: building a humanoid robot. In: International workshop on computation for metaphors, analogy, and agents, pp 52–87. Springer

  13. Cañamero L, Fredslund J (2001) I show you how I like you: can you read it in my face? IEEE Trans Syst Man Cybern Part A Syst Hum 31(5):454–459

  14. Castellano G, Villalba SD, Camurri A (2007) Recognising human emotions from body movement and gesture dynamics. In: International conference on affective computing and intelligent interaction, pp 71–82. Springer

  15. Clarke TJ, Bradshaw MF, Field DT, Hampson SE, Rose D (2005) The perception of emotion from body movement in point-light displays of interpersonal dialogue. Perception 34(10):1171–1180

  16. Coulson M (2004) Attributing emotion to static body postures: recognition accuracy, confusions, and viewpoint dependence. J Nonverbal Behav 28(2):117–139


  17. Crawley JN (1985) Exploratory behavior models of anxiety in mice. Neurosci Biobehav Rev 9(1):37–44

  18. Culley KE, Madhavan P (2013) A note of caution regarding anthropomorphism in HCI agents. Comput Human Behav 29(3):577–579

  19. Dautenhahn K, Nehaniv CL, Walters ML, Robins B, Kose-Bagci H, Assif N, Blow M (2009) KASPAR: a minimally expressive humanoid robot for human–robot interaction research. Appl Bionics Biomech 6(3–4):369–397

  20. De Meijer M (1989) The contribution of general features of body movement to the attribution of emotions. J Nonverbal Behav 13(4):247–268


  21. Denwood MJ et al (2016) runjags: an R package providing interface utilities, model templates, parallel computing methods and additional distributions for MCMC models in JAGS. J Stat Softw 71(9):1–25

  22. DeSteno D, Breazeal C, Frank RH, Pizarro D, Baumann J, Dickens L, Lee JJ (2012) Detecting the trustworthiness of novel partners in economic exchange. Psychol Sci 23(12):1549–1556

  23. Destephe M, Brandao M, Kishi T, Zecca M, Hashimoto K, Takanishi A (2014) Emotional gait: effects on humans’ perception of humanoid robots. In: The 23rd IEEE international symposium on robot and human interactive communication, pp 261–266. IEEE

  24. Dittrich WH, Troscianko T, Lea SEG, Morgan D (1996) Perception of emotion from dynamic point-light displays represented in dance. Perception 25(6):727–738

  25. Eivers AR, Brendgen M, Borge AIH (2010) Associations between young children’s emotion attributions and prediction of outcome in differing social situations. Br J Dev Psychol 28(2):499–504

  26. Ekman P, Keltner D (1997) Universal facial expressions of emotion. In: Segerstrale UP, Molnar P (eds) Nonverbal communication: where nature meets culture, pp 27–46

  27. Ekman P, Sorenson ER, Friesen WV (1969) Pan-cultural elements in facial displays of emotion. Science 164(3875):86–88


  28. Erden MS (2013) Emotional postures for the humanoid robot NAO. Int J Soc Robot 5(4):441–456

  29. Fong T, Nourbakhsh I, Dautenhahn K (2003) A survey of socially interactive robots. Robot Auton Syst 42(3–4):143–166


  30. Glowinski D, Dael N, Camurri A, Volpe G, Mortillaro M, Scherer K (2011) Toward a minimal representation of affective gestures. IEEE Trans Affect Comput 2(2):106–118


  31. Hancock PA, Billings DR, Schaefer KE, Chen JYC, de Visser EJ, Parasuraman R (2011) A meta-analysis of factors affecting trust in human–robot interaction. Hum Factors 53(5):517–527

  32. Häring M, Bee N, André E (2011) Creation and evaluation of emotion expression with body movement, sound and eye color for humanoid robots. In: 2011 RO-MAN, pp 204–209. IEEE

  33. Hegel F, Gieselmann S, Peters A, Holthaus P, Wrede B (2011) Towards a typology of meaningful signals and cues in social robotics. In: 2011 RO-MAN, pp 72–78. IEEE

  34. Johnson DO, Cuijpers RH (2019) Investigating the effect of a humanoid robot’s head position on imitating human emotions. Int J Soc Robot 11(1):65–74

  35. Johnson DO, Cuijpers RH, Juola JF, Torta E, Simonov M, Frisiello A, Bazzani M, Yan W, Weber C, Wermter S et al (2013) Socially assistive robots: a comprehensive approach to extending independent living. Int J Soc Robot 6(2):195–211


  36. Johnson DO, Cuijpers RH, Pollmann K, van de Ven AA (2016) Exploring the entertainment value of playing games with a humanoid robot. Int J Soc Robot 8(2):247–269


  37. Johnson DO, Cuijpers RH, van der Pol D (2013) Imitating human emotions with artificial facial expressions. Int J Soc Robot 5(4):503–513


  38. Kędzierski J, Muszyński R, Zoll C, Oleksy A, Frontkiewicz M (2013) EMYS: emotive head of a social robot. Int J Soc Robot 5(2):237–249

  39. Kiesler S, Powers A, Fussell SR, Torrey C (2008) Anthropomorphic interactions with a robot and robot-like agent. Social Cognition 26(2):169–181


  40. Koay KL, Lakatos G, Syrdal DS, Gácsi M, Bereczky B, Dautenhahn K, Miklósi A, Walters ML (2013) Hey! There is someone at your door. A hearing robot using visual communication signals of hearing dogs to communicate intent. In: 2013 IEEE symposium on artificial life (ALife), pp 90–97. IEEE

  41. Kruschke J (2014) Doing Bayesian data analysis: a tutorial with R, JAGS, and Stan. Academic Press, Cambridge


  42. Kühnlenz K, Sosnowski S, Buss M (2010) Impact of animal-like features on emotion expression of robot head EDDIE. Adv Robot 24(8–9):1239–1255

  43. Lakatos G, Gácsi M, Konok V, Bruder I, Bereczky B, Korondi P, Miklosi A (2014) Emotion attribution to a non-humanoid robot in different social situations. PLoS ONE 9(12):e114207


  44. Landrum AR, Eaves BS Jr, Shafto P (2015) Learning to trust and trusting to learn: a theoretical framework. Trends Cognit Sci 19(3):109–111

  45. Law T, Chita-Tegmark M, Scheutz M (2020) The interplay between emotional intelligence, trust, and gender in human–robot interaction. Int J Soc Robot. https://doi.org/10.1007/s12369-020-00624-1


  46. Law T, Scheutz M (forthcoming) Trust: recent concepts and evaluations in human–robot interaction

  47. Lee JD, See KA (2004) Trust in automation: designing for appropriate reliance. Hum Factors 46(1):50–80

  48. Liddell TM, Kruschke JK (2018) Analyzing ordinal data with metric models: what could possibly go wrong? J Exp Soc Psychol 79:328–348

  49. Mathur MB, Reichling DB (2009) An uncanny game of trust: social trustworthiness of robots inferred from subtle anthropomorphic facial cues. In: 2009 4th ACM/IEEE international conference on human–robot interaction (HRI), pp 313–314. IEEE

  50. McColl D, Nejat G (2014) Recognizing emotional body language displayed by a human-like social robot. Int J Soc Robot 6(2):261–280


  51. Metta G, Sandini G, Vernon D, Natale L, Nori F (2008) The iCub humanoid robot: an open platform for research in embodied cognition. In: Proceedings of the 8th workshop on performance metrics for intelligent systems, pp 50–56

  52. Moreno P, Nunes R, Figueiredo R, Ferreira R, Bernardino A, Santos-Victor J, Beira R, Vargas L, Aragao D, Aragao MV (2016) A humanoid on wheels for assistive robotics. In: Robot 2015: Second Iberian robotics conference, pp 17–28. Springer

  53. Nishio S, Ishiguro H, Hagita N (2007) Geminoid: teleoperated android of an existing person. Hum Robots New Dev 14:343–352


  54. Novikova J, Watts L (2014) A design model of emotional body expressions in non-humanoid robots. In: Proceedings of the second international conference on Human-agent interaction, pp 353–360

  55. Ososky S, Schuster D, Phillips E, Jentsch FG (2013) Building appropriate trust in human–robot teams. In: 2013 AAAI spring symposium series

  56. Rahman SMM, Wang Y (2018) Mutual trust-based subtask allocation for human–robot collaboration in flexible lightweight assembly in manufacturing. Mechatronics 54:94–109


  57. Reinhardt J, Pereira A, Beckert D, Bengler K (2017) Dominance and movement cues of robot motion: a user study on trust and predictability. In: 2017 IEEE international conference on systems, man, and cybernetics (SMC), pp 1493–1498. IEEE

  58. Salem M, Eyssel F, Rohlfing K, Kopp S, Joublin F (2011) Effects of gesture on the perception of psychological anthropomorphism: a case study with a humanoid robot. In: International conference on social robotics, pp 31–41. Springer

  59. Sandoval EB, Brandstetter J, Obaid M, Bartneck C (2016) Reciprocity in human–robot interaction: a quantitative approach through the prisoner’s dilemma and the ultimatum game. Int J Soc Robot 8(2):303–317

  60. Savery R, Rose R, Weinberg G (2019) Establishing human–robot trust through music-driven robotic emotion prosody and gesture. In: 2019 28th IEEE international conference on robot and human interactive communication (RO-MAN), pp 1–7. IEEE

  61. Sosnowski S, Bittermann A, Kuhnlenz K, Buss M (2006) Design and evaluation of emotion-display EDDIE. In: 2006 IEEE/RSJ international conference on intelligent robots and systems, pp 3113–3118. IEEE

  62. Sreenivasa M, Soueres P, Laumond J-P (2012) Walking to grasp: modeling of human movements as invariants and an application to humanoid robotics. IEEE Trans Syst Man Cybern Part A: Syst Hum 42(4):880–893


  63. Terada K, Yamauchi A, Ito A (2012) Artificial emotion expression for a robot by dynamic color change. In: 2012 IEEE RO-MAN: The 21st IEEE international symposium on robot and human interactive communication, pp 314–321. IEEE

  64. Torta E, Werner F, Johnson DO, Juola JF, Cuijpers RH, Bazzani M, Oberzaucher J, Lemberger J, Lewy H, Bregman J (2014) Evaluation of a small socially-assistive humanoid robot in intelligent homes for the care of the elderly. J Intell Robot Syst 76(1):57–71


  65. Tsiourti C, Weiss A, Wac K, Vincze M (2017) Designing emotionally expressive robots: a comparative study on the perception of communication modalities. In: Proceedings of the 5th international conference on human agent interaction, pp 213–222

  66. Valenti A, Block A, Chita-Tegmark M, Gold M, Scheutz M (2020) Emotion expression in a socially assistive robot for persons with Parkinson’s disease. In: Proceedings of the 13th ACM international conference on pervasive technologies related to assistive environments, pp 1–10

  67. van Pinxteren MME, Wetzels RWH, Rüger J, Pluymaekers M, Wetzels M (2019) Trust in humanoid robots: implications for services marketing. J Serv Market. https://doi.org/10.1108/JSM-01-2018-0045


  68. van Straten CL, Peter J, Kühne R, de Jong C, Barco A (2018) Technological and interpersonal trust in child–robot interaction: an exploratory study. In: Proceedings of the 6th international conference on human–agent interaction, pp 253–259

  69. Wagner AR (2009) The role of trust and relationships in human–robot social interaction. PhD thesis, Georgia Institute of Technology

  70. Waytz A, Heafner J, Epley N (2014) The mind in the machine: anthropomorphism increases trust in an autonomous vehicle. J Exp Soc Psychol 52:113–117


  71. Zanatto D, Patacchiola M, Goslin J, Cangelosi A (2016) Priming anthropomorphism: can the credibility of humanlike robots be transferred to non-humanlike robots? In: 2016 11th ACM/IEEE international conference on human–robot interaction (HRI), pp 543–544. IEEE

  72. Zecca M, Mizoguchi Y, Endo K, Iida F, Kawabata Y, Endo N, Itoh K, Takanishi A (2009) Whole body emotion expressions for KOBIAN humanoid robot: preliminary experiments with different emotional patterns. In: RO-MAN 2009: The 18th IEEE international symposium on robot and human interactive communication, pp 381–386. IEEE


Funding

No funding was received for conducting this study.

Author information

Corresponding author

Correspondence to Theresa Law.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Law, T., de Leeuw, J. & Long, J.H. How Movements of a Non-Humanoid Robot Affect Emotional Perceptions and Trust. Int J of Soc Robotics 13, 1967–1978 (2021). https://doi.org/10.1007/s12369-020-00711-3

