Abstract
This paper explores the expressive capabilities of a swarm of miniature mobile robots within the context of inter-robot interactions and their mapping to the so-called fundamental emotions. In particular, we investigate how motion and shape descriptors that are psychologically associated with different emotions can be incorporated into different swarm behaviors for the purpose of artistic expositions. Based on these characterizations from social psychology, a set of swarm behaviors is created, where each behavior corresponds to a fundamental emotion. The effectiveness of these behaviors is evaluated in a survey in which participants are asked to associate different swarm behaviors with the fundamental emotions. The results of the survey show that most participants assigned to each video the emotion it was designed to portray. These results confirm that abstract descriptors associated with the fundamental emotions in social psychology provide useful motion characterizations that can be effectively transformed into expressive behaviors for a swarm of simple ground mobile robots.
Notes
In this context, the term valence designates the intrinsic attractiveness (positive valence) or aversiveness (negative valence) of an event, object, or situation [20]. The valence of an emotion thus characterizes its positive or negative connotation. Among the fundamental emotions, happiness and surprise have positive valence, while the remaining four—sadness, fear, disgust and anger—are classified under negative valence [51]. On the other hand, the term arousal refers to the activation or deactivation associated with an emotion.
References
Ackerman E (2014) Flying LampshadeBots Come Alive in Cirque du Soleil. IEEE Spectrum
Alonso-Mora J, Siegwart R, Beardsley P (2014) Human–robot swarm interaction for entertainment: from animation display to gesture based control. In: Proceedings of the 2014 ACM/IEEE international conference on human–robot interaction, HRI ’14. ACM, New York, pp 98–98
Aronoff J (2006) How we recognize angry and happy emotion in people, places, and things. Cross Cult Res 40(1):83–105
Barakova EI, Lourens T (2010) Expressing and interpreting emotional movements in social games with robots. Pers Ubiquitous Comput 14(5):457–467
Belpaeme T, Baxter P, Read R, Wood R, Cuayáhuitl H, Kiefer B, Racioppa S, Kruijff-Korbayová I, Athanasopoulos G, Enescu V, Looije R, Neerincx M, Demiris Y, Ros-Espinoza R, Beck A, Cañamero L, Hiolle A, Lewis M, Baroni I, Nalin M, Cosi P, Paci G, Tesser F, Sommavilla G, Humbert R (2013) Multimodal child–robot interaction: building social bonds. J Hum Robot Interact 1(2):33–53
Bi T, Fankhauser P, Bellicoso D, Hutter M (2018) Real-time dance generation to music for a legged robot. In: 2018 IEEE/RSJ international conference on intelligent robots and systems (IROS). Madrid, pp 1038–1044
Breazeal C (2003) Toward sociable robots. Robot Auton Syst 42:167–175
Bretan M, Hoffman G, Weinberg G (2015) Emotionally expressive dynamic physical behaviors in robots. Int J Hum Comput Stud 78:1–16
Brown L, Kerwin R, Howard AM (2013) Applying behavioral strategies for student engagement using a robotic educational agent. In: 2013 IEEE international conference on systems, man, and cybernetics. pp 4360–4365
Cabibihan JJ, Javed H, Ang M, Aljunied SM (2013) Why robots? A survey on the roles and benefits of social robots in the therapy of children with autism. Int J Soc Robot 5(4):593–618
Camurri A, Mazzarino B, Ricchetti M, Timmers R, Volpe G (2004) Multimodal analysis of expressive gesture in music and dance performances. In: Camurri A, Volpe G (eds) Gesture-based communication in human–computer interaction. Springer, Berlin Heidelberg, Berlin, Heidelberg, pp 20–39
Collier GL (1996) Affective synesthesia: extracting emotion space from simple perceptual stimuli. Motiv Emot 20(1):1–32
Cortes J, Martinez S, Karatas T, Bullo F (2004) Coverage control for mobile sensing networks. IEEE Trans Robot Autom 20(2):243–255
de Rooij A, Broekens J, Lamers MH (2013) Abstract expressions of affect. Int J Synth Emot 4(1):1–31
Dean M, D’Andrea R, Donovan M (2008) Robotic chair. Contemporary Art Gallery, Vancouver, B.C
Diaz-Mercado Y, Lee SG, Egerstedt M (2015) Distributed dynamic density coverage for human–swarm interactions. In: 2015 American control conference (ACC). pp 353–358
Dietz G, Jane JE, Washington P, Kim LH, Follmer S (2017) Human perception of swarm robot motion. In: Proceedings of the 2017 CHI conference extended abstracts on human factors in computing systems. Denver, Colorado, pp 2520–2527
Dunstan BJ, Silvera-Tawil D, Koh JTKV, Velonaki M (2016) Cultural robotics: robots as participants and creators of culture. In: Koh JT, Dunstan BJ, Silvera-Tawil D, Velonaki M (eds) Cultural robotics. Springer, Cham, pp 3–13
Ekman P (1993) Facial expression and emotion. Am Psychol 48(4):384–392
Frijda N (1986) The emotions. Studies in emotion and social interaction. Cambridge University Press, Cambridge
Goodrich MA, Schultz AC (2007) Human–robot interaction: a survey. Found Trends Hum Comput Interact 1(3):203–275
Hoffman G (2012) Dumb robots, smart phones: a case study of music listening companionship. In: 2012 IEEE RO-MAN: The 21st IEEE international symposium on robot and human interactive communication. pp 358–363
Hoffman G, Kubat R, Breazeal C (2008) A hybrid control system for puppeteering a live robotic stage actor. In: RO-MAN 2008—the 17th IEEE international symposium on robot and human interactive communication. pp. 354–359
Hoffman G, Weinberg G (2010) Gesture-based human–robot jazz improvisation. In: 2010 IEEE international conference on robotics and automation. pp 582–587
Izard CE (1977) Human emotions. Emotions, personality, and psychotherapy. Springer, New York
Izard CE (2009) Emotion theory and research: highlights, unanswered questions, and emerging issues. Annu Rev Psychol 60:1–25
Juslin P (2005) From mimesis to catharsis: expression, perception, and induction of emotion in music. Oxford University Press, Oxford
Justh EW, Krishnaprasad PS (2003) Steering laws and continuum models for planar formations. In: 42nd IEEE International conference on decision and control. vol. 4, pp 3609–3614
Knight H, Simmons R (2014) Expressive motion with x, y and theta: Laban effort features for mobile robots. In: The 23rd IEEE international symposium on robot and human interactive communication. pp 267–273
Kolling A, Walker P, Chakraborty N, Sycara K, Lewis M (2016) Human interaction with robot swarms: a survey. IEEE Trans Hum Mach Syst 46(1):9–26
Kozima H, Michalowski MP, Nakagawa C (2009) Keepon. Int J Soc Robot 1(1):3–18
Laban R, Lawrence F (1947) Effort. Macdonald & Evans Ltd, London
LaViers A, Teague L, Egerstedt M (2014) Style-based robotic motion in contemporary dance performance. Springer, Cham, pp 205–229
Lee D, Park S, Hahn M, Lee N (2014) Robot actors and authoring tools for live performance system. In: 2014 International conference on information science applications (ICISA). pp 1–3
Lee JH, Park JY, Nam TJ (2007) Emotional interaction through physical movement. Springer, Berlin Heidelberg, pp 401–410
Levillain F, St-Onge D, Zibetti E, Beltrame G (2018) More than the sum of its parts: assessing the coherence and expressivity of a robotic swarm. In: 2018 IEEE international symposium on robot and human interactive communication (RO-MAN). pp 583–588
Lourens T, van Berkel R, Barakova E (2010) Communicating emotions and mental states to robots in a real time parallel framework using Laban movement analysis. Robot Auton Syst 58(12):1256–1265
Marshall JA, Broucke ME, Francis BA (2004) Formations of vehicles in cyclic pursuit. IEEE Trans Autom Control 49(11):1963–1974
Masuda M, Kato S, Itoh H (2010) A Laban-based approach to emotional motion rendering for human–robot interaction. In: Yang HS, Malaka R, Hoshino J, Han JH (eds) Entertainment computing—ICEC 2010. Springer, Berlin Heidelberg, pp 372–380
Nakazawa A, Nakaoka S, Ikeuchi K, Yokoi K (2002) Imitating human dance motions through motion structure analysis. IEEE/RSJ Int Conf Intell Robot Syst 3:2539–2544
Olfati-Saber R (2002) Near-identity diffeomorphisms and exponential \(\epsilon \)-tracking and \(\epsilon \)-stabilization of first-order nonholonomic se(2) vehicles. In: Proceedings of the 2002 American control conference. vol 6, pp 4690–4695
Or J (2009) Towards the development of emotional dancing humanoid robots. Int J Soc Robot 1(4):367
Perkowski M, Bhutada A, Lukac M, Sunardi M (2013) On synthesis and verification from event diagrams in a robot theatre application. In: 2013 IEEE 43rd international symposium on multiple-valued logic. pp 77–83
Perkowski M, Sasao T, Kim JH, Lukac M, Allen J, Gebauer S (2005) Hahoe KAIST robot theatre: learning rules of interactive robot behavior as a multiple-valued logic synthesis problem. In: 35th International symposium on multiple-valued Logic (ISMVL’05), pp 236–248
Pickem D, Lee M, Egerstedt M (2015) The GRITSBot in its natural habitat—a multi-robot testbed. In: 2015 IEEE international conference on robotics and automation (ICRA). Seattle, WA, pp 4062–4067
Plutchik R (2001) The nature of emotions: human emotions have deep evolutionary roots, a fact that may explain their complexity and provide tools for clinical practice. Am Sci 89(4):344–350
Pollick FE, Paterson HM, Bruderlin A, Sanford AJ (2001) Perceiving affect from arm movement. Cognition 82(2):B51–B61
Ramirez JL, Pavone M, Frazzoli E, Miller DW (2009) Distributed control of spacecraft formation via cyclic pursuit: theory and experiments. In: 2009 American control conference. pp 4811–4817
Rimé B, Boulanger B, Laubin P, Richir M, Stroobants K (1985) The perception of interpersonal emotions originated by patterns of movement. Motiv Emot 9(3):241–260
Ross RT (1938) A statistic for circular series. J Educ Psychol 29(5):384–389
Russell JA (1980) A circumplex model of affect. J Pers Soc Psychol 39(6):1161–1178
Schoellig AP, Siegel H, Augugliaro F, D’Andrea R (2014) So you think you can dance? Rhythmic flight performances with quadrocopters. Springer, Cham, pp 73–105
Sheridan TB (2016) Human–robot interaction: status and challenges. Hum Factors 58(4):525–532
Shinozaki K, Iwatani A, Nakatsu R (2008) Construction and evaluation of a robot dance system. In: RO-MAN 2008—the 17th IEEE international symposium on robot and human interactive communication. pp 366–370
St-Onge D, Levillain F, Zibetti E, Beltrame G (2019) Collective expression: how robotic swarms convey information with group motion. Paladyn J Behav Robot 10:418–435
Sunardi M, Perkowski M (2018) Music to motion: using music information to create expressive robot motion. Int J Soc Robot 10(1):43–63
Vlachos E, Jochum E, Demers L (2018) Heat: The harmony exoskeleton self-assessment test. In: 2018 27th IEEE international symposium on robot and human interactive communication (RO-MAN). pp 577–582
Wilson S, Glotfelter P, Wang L, Mayya S, Notomista G, Mote M, Egerstedt M (2020) The robotarium: globally impactful opportunities, challenges, and lessons learned in remote-access, distributed control of multirobot systems. IEEE Control Syst Mag 40(1):26–44
Funding
This work was supported by “la Caixa” Banking Foundation under Grant LCF/BQ/AA16/11580039.
Ethics declarations
Conflict of interest
The authors declare that they have no conflict of interest.
Appendices
A Swarm Behaviors
In Sect. 3.1, a series of swarm behaviors was designed based on the movement and shape attributes associated with the different fundamental emotions. This appendix includes the mathematical expressions of the control laws used to produce the different swarm behaviors. Note that all the control laws included here treat each robot in the swarm as a point that can move omnidirectionally according to single integrator dynamics as in (1). The transformation from single integrator dynamics to unicycle dynamics is discussed in detail in Appendix B.
1.1 A.1 Happiness
The swarm movement selected for the happiness behavior consists of the robots following the contour of a circle with a superimposed sinusoid. This shape is illustrated in Fig. 15a and can be parameterized as
where R is the radius of the main circle and A and f are the amplitude and frequency of the superimposed sinusoid, respectively. For the shape in Fig. 15a, the frequency of the superimposed sinusoid is \(f=6\).
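The parameterization in (3) is not reproduced on this page, but the description above—a circle of radius R with a sinusoid of amplitude A and frequency f superimposed on its contour—suggests a polar form along the following lines. The exact function and all parameter values here are assumptions for illustration:

```python
import numpy as np

def happiness_contour(theta, R=1.0, A=0.15, f=6):
    """Point on a circle of radius R with a sinusoid of amplitude A
    and frequency f superimposed on it (assumed polar form)."""
    r = R + A * np.sin(f * theta)
    return np.array([r * np.cos(theta), r * np.sin(theta)])
```

With \(f=6\), the contour completes six sinusoidal oscillations per revolution, consistent with the shape described for Fig. 15a.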
If we have a swarm of N robots, we can initially position Robot i according to
with
Then the team will depict the desired shape if each robot follows a point evolving along the contour in (3),
with \(\theta _i\) a function of time \(t\in \mathbb {R}_+\),
1.2 A.2 Surprise
In the case of the surprise emotion, each robot follows a point moving along a circle with an expanding radius, as in Fig. 15b. Such a shape can be parameterized as,
with
to create a radius that expands from \(R_{min}\) to \( R_{max}\).
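The exact expansion law is not reproduced on this page; a simple saturating linear growth from \(R_{min}\) to \(R_{max}\) is assumed in the sketch below, with all parameter values illustrative:

```python
import numpy as np

def expanding_radius(t, R_min=0.2, R_max=1.0, rate=0.1):
    """Radius growing from R_min and saturating at R_max
    (assumed expansion law)."""
    return min(R_min + rate * t, R_max)

def surprise_contour(theta, t):
    """Reference point on the expanding circle for the surprise behavior."""
    R = expanding_radius(t)
    return np.array([R * np.cos(theta), R * np.sin(theta)])
```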
Analogously to the procedure described in Appendix A.1, in this case the robots can be initially located at
with \(\theta _i(0)\) given by (5). The controller for each robot is then given by,
with \(\theta _i(t)\) as in (7).
1.3 A.3 Sadness
For the case of the sadness emotion, the robots move along a circle that is small compared to the domain. The strategy is analogous to the ones in (6) and (11), with the parameterization of the contour given by,
1.4 A.4 Anger, Fear and Disgust
For the remaining emotions—anger, disgust and fear—the swarm coordination is based on a coverage control strategy, which allows the user to specify the areas in which the robots should concentrate.
If we denote by D the domain of the robots, the areas where we want to position the robots can be specified by defining a density function, \(\phi :D\rightarrow [0,\infty )\), that assigns higher values to the areas where the robots should concentrate. We can make the robots distribute themselves according to this density function by implementing a standard coverage controller such as [13],
where \(p = [p_1^T,\dots ,p_N^T]^T\) denotes the aggregate positions of the robots and \(\kappa >0\) is a proportional gain. In the controller in (13), \(c_i(p)\) denotes the center of mass of the Voronoi cell of Robot i,
with the Voronoi cell being characterized as,
Figure 16 shows the densities selected for each of the emotions, where the red circles represent the positions of the robots in the domain upon convergence, achieved by running the controller in (13).
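A minimal sketch of one step of the coverage controller described above: the Voronoi centroids \(c_i(p)\) are approximated by discretizing the domain on a grid, and each robot is driven proportionally toward its centroid. The unit-square domain, grid resolution, and gain are assumptions for illustration:

```python
import numpy as np

def coverage_control(p, phi, kappa=1.0, grid_n=50):
    """One coverage step u_i = kappa * (c_i(p) - p_i), with the Voronoi
    centroids c_i approximated on a discretized unit square."""
    xs = np.linspace(0.0, 1.0, grid_n)
    X, Y = np.meshgrid(xs, xs)
    pts = np.stack([X.ravel(), Y.ravel()], axis=1)   # samples of the domain D
    w = phi(pts)                                     # density at each sample
    # Voronoi assignment: each sample belongs to its closest robot.
    d = np.linalg.norm(pts[:, None, :] - p[None, :, :], axis=2)
    owner = np.argmin(d, axis=1)
    u = np.zeros_like(p)
    for i in range(len(p)):
        m = owner == i
        mass = w[m].sum()
        if mass > 0:
            c_i = (w[m][:, None] * pts[m]).sum(axis=0) / mass  # center of mass
            u[i] = kappa * (c_i - p[i])
    return u
```

Iterating \(p \leftarrow p + \Delta t\, u\) performs a Lloyd-like descent that drives the robots toward a centroidal Voronoi configuration with respect to \(\phi\), concentrating them in the high-density regions shown in Fig. 16.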
B Individual Robot Control
The swarm behaviors described in Appendix A assume that each robot in the swarm can move omnidirectionally according to
with \(p_i=(x_i, y_i)^T\in \mathbb {R}^2\) the Cartesian position of Robot i in the plane and \(u_i=(u_{ix}, u_{iy})^T\in \mathbb {R}^2\) the desired velocity. However, the GRITSBot (Fig. 1) has a differential-drive configuration and cannot move omnidirectionally as its motion is constrained in the direction perpendicular to its wheels. Instead, its motion can be expressed as unicycle dynamics,
with \(\theta _i\) the orientation of Robot i and \((v_i, \omega _i)^T\) the linear and angular velocities executable by the robot, as shown in Fig. 17.
In this paper, the single integrator dynamics in (16) are converted into unicycle dynamics, as in (17), using a near-identity diffeomorphism [41],
A graphical representation of this transformation is included in Fig. 17: the input \(u = (u_x, u_y)^T\) is applied to a point \(\tilde{p}\) located at a distance l in front of the robot, which can move according to the single integrator dynamics in (16). The effect of this parameter on the movement of the robot is illustrated in Fig. 9. The parameter K acts as a proportional gain.
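The exact form of the diffeomorphism in [41] is not reproduced on this page; the sketch below shows a common variant of the mapping, in which the single-integrator velocity commanded at the point a distance l ahead of the robot is projected onto the robot's heading to obtain \(v\) and onto the perpendicular direction (scaled by \(1/l\)) to obtain \(\omega\). The values of l and K are illustrative:

```python
import numpy as np

def si_to_uni(u, theta, l=0.05, K=1.0):
    """Map the single-integrator velocity u, applied at the point a
    distance l ahead of the robot, to unicycle inputs (v, omega)."""
    c, s = np.cos(theta), np.sin(theta)
    v = K * (c * u[0] + s * u[1])            # component along the heading
    omega = K * (-s * u[0] + c * u[1]) / l   # component turning the heading
    return v, omega
```

A smaller l makes the robot turn more aggressively to realize lateral velocity commands, which is the effect on the trajectories illustrated in Fig. 9.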
Cite this article
Santos, M., Egerstedt, M. From Motions to Emotions: Can the Fundamental Emotions be Expressed in a Robot Swarm? Int J of Soc Robotics 13, 751–764 (2021). https://doi.org/10.1007/s12369-020-00665-6