TY - CONF
T1 - Induction of the being-seen-feeling by an embodied conversational agent in a socially interactive context
T2 - 21st ACM International Conference on Intelligent Virtual Agents
Y1 - 2021
A1 - Grondin-Verdon, Mickaëlla
A1 - Younsi, Nezih
A1 - Grimaldi, Michele
A1 - Pelachaud, Catherine
A1 - Chaby, Laurence
A1 - Cañamero, Lola
JF - 21st ACM International Conference on Intelligent Virtual Agents
UR - https://hal.archives-ouvertes.fr/hal-03342893/document
ER -
TY - JOUR
T1 - Interpretation of Emotional Body Language Displayed by a Humanoid Robot: A Case Study with Children
JF - International Journal of Social Robotics
Y1 - 2013
A1 - Beck, Aryel
A1 - Cañamero, Lola
A1 - Hiolle, Antoine
A1 - Damiano, Luisa
A1 - Cosi, Piero
A1 - Tesser, Fabio
A1 - Sommavilla, Giacomo
KW - emotion
KW - emotional body language
KW - perception
KW - Social robotics
AB - The work reported in this paper focuses on giving humanoid robots the capacity to express emotions with their body. Previous results show that adults are able to interpret different key poses displayed by a humanoid robot and also that changing the head position affects the expressiveness of the key poses in a consistent way. Moving the head down leads to decreased arousal (the level of energy) and valence (positive or negative emotion) whereas moving the head up produces an increase along these dimensions. Hence, changing the head position during an interaction should send intuitive signals. The study reported in this paper tested children’s ability to recognize the emotional body language displayed by a humanoid robot. The results suggest that body postures and head position can be used to convey emotions during child-robot interaction.
VL - 5
UR - https://link.springer.com/article/10.1007/s12369-013-0193-z
ER -
TY - CONF
T1 - Interpretation of Emotional Body Language Displayed by Robots
T2 - Proc. 3rd International Workshop on Affective Interaction in Natural Environments, AFFINE'10
Y1 - 2010
A1 - Beck, Aryel
A1 - Hiolle, Antoine
A1 - Mazel, Alexandre
A1 - Cañamero, Lola
AB - In order for robots to be socially accepted and generate empathy, they must display emotions. For robots such as Nao, body language is the best medium available, as they do not have the ability to display facial expressions. Displaying emotional body language that can be interpreted whilst interacting with the robot should greatly improve its acceptance. This research investigates the creation of an "Affect Space" for the generation of emotional body language that could be displayed by robots. An Affect Space is generated by "blending" (i.e. interpolating between) different emotional expressions to create new ones. An Affect Space for body language based on the Circumplex Model of emotions has been created. The experiment reported in this paper investigated the perception of specific key poses from the Affect Space. The results suggest that this Affect Space for body expressions can be used to improve the expressiveness of humanoid robots. In addition, early results of a pilot study are described. It revealed that the context helps human subjects improve their recognition rate during a human-robot imitation game, and in turn this recognition leads to better outcomes of the interactions.
JF - Proc. 3rd International Workshop on Affective Interaction in Natural Environments, AFFINE'10
PB - ACM
CY - Firenze, Italy
SN - 978-1-4503-0170-1
ER -
TY - CONF
T1 - The Importance of the Body in Affect-Modulated Action Selection: A Case Study Comparing Proximal Versus Distal Perception in a Prey-Predator Scenario
T2 - Proc. 3rd Intl. Conference on Affective Computing and Intelligent Interaction (ACII 2009)
Y1 - 2009
A1 - O'Bryne, Claire
A1 - Cañamero, Lola
A1 - Murray, John C.
AB - In the context of the animat approach, we investigate the effect of an emotion-like hormonal mechanism, as a modulator of perception - and second order controller to an underlying motivation-based action selection architecture - on brain-body-environment interactions within a prey-predator scenario. We are particularly interested in the effects that affective modulation of different perceptual capabilities has on the dynamics of interactions between predator and prey, as part of a broader study of the adaptive value of emotional states such as "fear" and "aggression" in the context of action selection. In this paper we present experiments where we modulated the architecture of a prey robot using two different types of sensory capabilities, proximal and distal, effectively creating combinations of different prey "brains" and "bodies".
JF - Proc. 3rd Intl. Conference on Affective Computing and Intelligent Interaction (ACII 2009)
PB - IEEE Press
CY - Amsterdam, The Netherlands
ER -
TY - CONF
T1 - The Influence of Social Interaction on the Perception of Emotional Expression: A Case Study with a Robot Head
T2 - Advances in Robotics: Proc. FIRA RoboWorld Congress 2009
Y1 - 2009
A1 - Murray, John C.
A1 - Cañamero, Lola
A1 - Bard, Kim A.
A1 - Ross, Marina Davila
A1 - Thorsteinsson, Kate
ED - Kim, Jong-Hwan
ED - Ge, Shuzhi Sam
ED - Vadakkepat, Prahlad
ED - Jesse, Norbert
ED - Al Mamun, Abdullah
ED - Puthusserypady K, Sadasivan
ED - Rückert, Ulrich
ED - Sitte, Joaquin
ED - Witkowski, Ulf
ED - Nakatsu, Ryohei
ED - Braunl, Thomas
ED - Baltes, Jacky
ED - Anderson, John
ED - Wong, Ching-Chang
ED - Verner, Igor
ED - Ahlgren, David
AB - In this paper we focus primarily on the influence that socio-emotional interaction has on the perception of emotional expression by a robot. We also investigate and discuss the importance of emotion expression in socially interactive situations involving human-robot interaction (HRI), and show the importance of utilising emotion expression when dealing with interactive robots that are to learn and develop in socially situated environments. We discuss early expressional development and the function of emotion in communication in humans, and how this can improve HRI communications. Finally, we provide experimental results showing how emotion-rich interaction via emotion expression can affect the HRI process by providing additional information.
JF - Advances in Robotics: Proc. FIRA RoboWorld Congress 2009
T3 - Lecture Notes in Computer Science
PB - Springer Berlin Heidelberg
CY - Incheon, Korea
VL - 5744
SN - 978-3-642-03983-6
UR - https://link.springer.com/chapter/10.1007%2F978-3-642-03983-6_10
ER -
TY - CONF
T1 - Introducing Neuromodulation to a Braitenberg Vehicle
T2 - Proc. 2005 IEEE Int. Conf. on Robotics and Automation: Robots get Closer to Humans (ICRA'05)
Y1 - 2005
A1 - French, Richard L. B.
A1 - Cañamero, Lola
AB - Artificial neural networks are often used as the control systems for mobile robots. However, although these models usually claim inspiration from biology, they often lack an analogue of the biological phenomenon called neuromodulation. In this paper, we describe our initial work exploring a simple model of neuromodulation, used to provide a mobile robot with foraging behaviour.
JF - Proc. 2005 IEEE Int. Conf. on Robotics and Automation: Robots get Closer to Humans (ICRA'05)
PB - IEEE Press
CY - Barcelona, Spain
SN - 0-7803-8914-X
UR - http://ieeexplore.ieee.org/abstract/document/1570763/
ER -
TY - JOUR
T1 - Intelligenza artificiale in medicina: progetto di una piattaforma mobile inserita in un ambiente intelligente per l'assistenza ai disabili e agli anziani
JF - Recenti Progressi in Medicina
Y1 - 2004
A1 - Cortés, Ulises
A1 - Annicchiarico, Roberta
A1 - Campana, Fabio
A1 - Vázquez-Salceda, Javier
A1 - Urdiales, Cristina
A1 - Cañamero, Lola
A1 - López, Maite
A1 - Sànchez-Marrè, Miquel
A1 - Di Vincenzo, Sarah
A1 - Caltagirone, Carlo
AB - A project based on the integration of new technologies and artificial intelligence to develop a device – e-tool – for disabled patients and elderly people is presented. A mobile platform in intelligent environments (skilled-care facilities and home-care), controlled and managed by a multi-level architecture, is proposed to support patients and caregivers to increase self-dependency in activities of daily living.
PB - Pensiero scientifico
VL - 95
IS - 4
ER -
TY - JOUR
T1 - I Show You How I Like You—Can You Read it in My Face?
JF - IEEE Transactions on Systems, Man and Cybernetics, Part A: Systems and Humans
Y1 - 2001
A1 - Cañamero, Lola D
A1 - Fredslund, Jakob
AB - We report work on a LEGO robot that displays different emotional expressions in response to physical stimulation, for the purpose of social interaction with humans. This is a first step toward our longer-term goal of exploring believable emotional exchanges to achieve plausible interaction with a simple robot. Drawing inspiration from theories of human basic emotions, we have implemented several prototypical expressions in the robot’s caricaturized face and conducted experiments to assess the recognizability of these expressions.
PB - IEEE
VL - 31
UR - http://ieeexplore.ieee.org/document/952719/
IS - 5
ER -
TY - RPRT
T1 - I Show You How I Like You: Human-Robot Interaction through Emotional Expression and Tactile Stimulation
Y1 - 2000
A1 - Cañamero, Lola D
A1 - Fredslund, Jakob
AB - We report work on a LEGO robot capable of displaying several emotional expressions in response to physical contact. Our motivation has been to explore believable emotional exchanges to achieve plausible interaction with a simple robot. We have worked toward this goal in two ways. First, acknowledging the importance of physical manipulation in children's interactions, interaction with the robot is through tactile stimulation; the various kinds of stimulation that can elicit the robot's emotions are grounded in a model of emotion activation based on different stimulation patterns. Second, emotional states need to be clearly conveyed. We have drawn inspiration from theories of human basic emotions with associated universal facial expressions, which we have implemented in a caricaturized face. We have conducted experiments on both children and adults to assess the recognizability of these expressions.
JF - Dept. of Computer Science Technical Report DAIMI PB 544
PB - University of Aarhus, Denmark
UR - http://ojs.statsbiblioteket.dk/index.php/daimipb/article/view/7078
ER -
TY - CONF
T1 - Imitating Human Performances to Automatically Generate Expressive Jazz Ballads
T2 - Proc. AISB'99 Symposium on Imitation in Animals and Artifacts
Y1 - 1999
A1 - Cañamero, D.
A1 - Arcos, Josep Lluís
A1 - López de Mántaras, Ramon
AB - One of the main problems with the automatic generation of expressive musical performances is to grasp the way in which human performers use musical knowledge that is not explicitly noted in musical scores. Moreover, this knowledge is tacit, difficult to verbalize, and therefore it must be acquired through a process of observation, imitation, and experimentation. For this reason, AI approaches based on declarative knowledge representations have serious limitations. An alternative approach is that of directly using the implicit knowledge that is in examples from recordings of human performances. In this paper, we describe a case-based reasoning system that generates expressive musical performances imitating examples of expressive human performances.
JF - Proc. AISB'99 Symposium on Imitation in Animals and Artifacts
PB - AISB
CY - Edinburgh, Scotland
ER -
TY - CONF
T1 - Issues in the Design of Emotional Agents
T2 - Emotional and Intelligent: The Tangled Knot of Cognition. Papers from the 1998 AAAI Fall Symposium
Y1 - 1998
A1 - Cañamero, D.
ED - Cañamero, D.
JF - Emotional and Intelligent: The Tangled Knot of Cognition. Papers from the 1998 AAAI Fall Symposium
PB - AAAI Press
ER -