Interpretation of Emotional Body Language Displayed by a Humanoid Robot: A Case Study with Children

Title: Interpretation of Emotional Body Language Displayed by a Humanoid Robot: A Case Study with Children
Publication Type: Journal Article
Year of Publication: 2013
Authors: Beck, A., Cañamero, L., Hiolle, A., Damiano, L., Cosi, P., Tesser, F., & Sommavilla, G.
Journal: International Journal of Social Robotics
Volume: 5
Pagination: 325–334
ISSN Number: 1875-4791
Keywords: emotion, emotional body language, perception, social robotics
Abstract:

The work reported in this paper focuses on giving humanoid robots the capacity to express emotions with their body. Previous results show that adults are able to interpret different key poses displayed by a humanoid robot and also that changing the head position affects the expressiveness of the key poses in a consistent way. Moving the head down leads to decreased arousal (the level of energy) and valence (positive or negative emotion) whereas moving the head up produces an increase along these dimensions. Hence, changing the head position during an interaction should send intuitive signals. The study reported in this paper tested children’s ability to recognize the emotional body language displayed by a humanoid robot. The results suggest that body postures and head position can be used to convey emotions during child-robot interaction.

URL: https://link.springer.com/article/10.1007/s12369-013-0193-z
DOI: 10.1007/s12369-013-0193-z