%0 Journal Article
%J ACM Transactions on Interactive Intelligent Systems
%D 2012
%T Eliciting Caregiving Behavior in Dyadic Human-robot Attachment-like Interactions
%A Antoine Hiolle
%A Lola Cañamero
%A Marina Davila-Ross
%A Kim A. Bard
%X We present here the design and applications of an arousal-based model controlling the behavior of a Sony AIBO robot during the exploration of a novel environment: a children's play mat. When the robot experiences too many new perceptions, the increase in arousal triggers calls for attention towards its human caregiver. The caregiver can choose either to calm the robot down by providing it with comfort, or to leave the robot to cope with the situation on its own. When the arousal of the robot has decreased, the robot moves on to further explore the play mat. We gathered results from two experiments using this arousal-driven control architecture. In the first setting, we show that such a robotic architecture allows the human caregiver to greatly influence the learning outcomes of the exploration episode, with some similarities to a primary caregiver during early childhood. In a second experiment, we tested how human adults behaved in a similar setup with two different robots: one “needy”, often demanding attention, and one more independent, requesting far less care or assistance. Our results show that human adults recognise each robot profile for what it was designed to be and behave as would be expected, caring more for the needy robot than for the other. Additionally, the subjects exhibited a preference for, and more positive affect towards, the robot designed as needy, both whilst interacting with it and when rating it. This experiment leads us to the conclusion that our architecture and setup succeeded in eliciting positive, caregiving behavior from adults of different age groups and technological backgrounds. Finally, the consistency and reactivity of the robot during this dyadic interaction appeared crucial for the enjoyment and engagement of the human partner.
%B ACM Transactions on Interactive Intelligent Systems
%I ACM
%C New York, NY
%V 2
%P 3:1–3:24
%G eng
%U https://dl.acm.org/doi/10.1145/2133366.2133369
%N 1
%R 10.1145/2133366.2133369

%0 Journal Article
%J ACM Transactions on Interactive Intelligent Systems
%D 2012
%T Emotional Body Language Displayed by Artificial Agents
%A Aryel Beck
%A Brett Stevens
%A Kim A. Bard
%A Lola Cañamero
%X Complex and natural social interaction between artificial agents (computer-generated or robotic) and humans necessitates the display of rich emotions in order to be believable, socially relevant, and accepted, and to generate the natural emotional responses that humans show in the context of social interaction, such as engagement or empathy. Whereas some robots use faces to display (simplified) emotional expressions, for other robots such as Nao, body language is the best medium available given their inability to convey facial expressions. Displaying emotional body language that can be interpreted whilst interacting with the robot should significantly improve naturalness. This research investigates the creation of an affect space for the generation of emotional body language to be displayed by humanoid robots. To do so, we conducted three experiments investigating how emotional body language displayed by agents is interpreted. The first experiment compared the interpretation of emotional body language displayed by humans and agents.
The results showed that emotional body language displayed by an agent or a human is interpreted in a similar way in terms of recognition. Following these results, emotional key poses were extracted from an actor's performances and implemented in a Nao robot. The interpretation of these key poses was validated in a second study, where participants were found to be better than chance at interpreting the key poses displayed. Finally, an affect space was generated by blending key poses and validated in a third study. Overall, these experiments confirmed that body language is an appropriate medium for robots to display emotions and suggested that an affect space for body expressions can be used to improve the expressiveness of humanoid robots.
%B ACM Transactions on Interactive Intelligent Systems
%I ACM
%C New York, NY
%V 2
%P 2:1–2:29
%G eng
%U https://dl.acm.org/doi/10.1145/2133366.2133368
%N 1
%R 10.1145/2133366.2133368

%0 Conference Paper
%B Proc. 19th Annual IEEE International Symposium on Robot and Human Interactive Communication (IEEE RO-MAN 2010)
%D 2010
%T Towards an Affect Space for Robots to Display Emotional Body Language
%A Aryel Beck
%A Lola Cañamero
%A Kim A. Bard
%X In order for robots to be socially accepted and to generate empathy, it is necessary that they display rich emotions. For robots such as Nao, body language is the best medium available, given their inability to convey facial expressions. Displaying emotional body language that can be interpreted whilst interacting with the robot should significantly improve its sociability. This research investigates the creation of an Affect Space for the generation of emotional body language to be displayed by robots. To create an Affect Space for body language, one has to establish the contribution of the different positions of the joints to the emotional expression. The experiment reported in this paper investigated the effect of varying a robot's head position on the interpretation, Valence, Arousal, and Stance of emotional key poses. It was found that participants were better than chance level at interpreting the key poses. This finding confirms that body language is an appropriate medium for robots to express emotions. Moreover, the results of this study support the conclusion that Head Position is an important body posture variable. Head Position up increased correct identification for some emotion displays (pride, happiness, and excitement), whereas Head Position down increased correct identification for other displays (anger, sadness). Fear, however, was identified well regardless of Head Position. Head up was always evaluated as more highly Aroused than Head straight or down. Evaluations of Valence (degree of negativity to positivity) and Stance (degree to which the robot was averse to approaching), however, depended on both Head Position and the emotion displayed. The effects of varying this single body posture variable were complex.
%B Proc. 19th Annual IEEE International Symposium on Robot and Human Interactive Communication (IEEE RO-MAN 2010)
%I IEEE
%C Viareggio, Italy
%P 464–469
%@ 978-1-4244-7991-7
%G eng
%R 10.1109/ROMAN.2010.5598649

%0 Conference Paper
%B Proc. 18th Annual IEEE International Symposium on Robot and Human Interactive Communication (IEEE RO-MAN 2009)
%D 2009
%T Assessing Human Responses to Different Robot Attachment Profiles
%A Antoine Hiolle
%A Kim A. Bard
%A Lola Cañamero
%X Emotional regulation is believed to be crucial for balanced emotional and cognitive development in infants.
Furthermore, during the first year of a child's life, the mother plays a central role in shaping this development through the attachment bond she shares with her child. Based on previous work on our model of arousal modulation for an autonomous robot, we present an experiment in which human adults interacted visually and through tactile contact with a Sony AIBO robot exploring a children's play mat. The robots had two different attachment profiles: one requiring less attention than the other. The subjects answered one questionnaire per robot, rating their experience with each. The analysis of the subjects' responses allows us to conclude that this setting was sufficient to elicit positive and active caretaking-like behaviours from the subjects, according to the profile of the robot they interacted with.
%B Proc. 18th Annual IEEE International Symposium on Robot and Human Interactive Communication (IEEE RO-MAN 2009)
%I IEEE Press
%C Toyama, Japan
%P 251–256
%8 09/2009
%@ 978-1-4244-5081-7
%G eng
%U http://ieeexplore.ieee.org/document/5326216/
%R 10.1109/ROMAN.2009.5326216

%0 Conference Paper
%B Advances in Robotics: Proc. FIRA RoboWorld Congress 2009
%D 2009
%T The Influence of Social Interaction on the Perception of Emotional Expression: A Case Study with a Robot Head
%A John C. Murray
%A Lola Cañamero
%A Kim A. Bard
%A Marina Davila-Ross
%A Kate Thorsteinsson
%E Jong-Hwan Kim
%E Shuzhi Sam Ge
%E Prahlad Vadakkepat
%E Norbert Jesse
%E Abdullah Al Mamun
%E Sadasivan Puthusserypady K
%E Ulrich Rückert
%E Joaquin Sitte
%E Ulf Witkowski
%E Ryohei Nakatsu
%E Thomas Bräunl
%E Jacky Baltes
%E John Anderson
%E Ching-Chang Wong
%E Igor Verner
%E David Ahlgren
%X In this paper we focus primarily on the influence that socio-emotional interaction has on the perception of emotional expression by a robot. We also investigate and discuss the importance of emotion expression in socially interactive situations involving human-robot interaction (HRI), showing its value when dealing with interactive robots that are to learn and develop in socially situated environments. We discuss early expressional development and the function of emotion in communication in humans, and how this can improve HRI communications. Finally, we provide experimental results showing how emotion-rich interaction via emotion expression can affect the HRI process by providing additional information.
%B Advances in Robotics: Proc. FIRA RoboWorld Congress 2009
%S Lecture Notes in Computer Science
%I Springer Berlin Heidelberg
%C Incheon, Korea
%V 5744
%P 63–72
%8 08/2009
%@ 978-3-642-03983-6
%G eng
%U https://link.springer.com/chapter/10.1007%2F978-3-642-03983-6_10
%R 10.1007/978-3-642-03983-6_10