Part of the book series: Lecture Notes in Computer Science (LNAI, volume 7072)
Abstract
Previous results show that adults are able to interpret different key poses displayed by the robot, and that changing the head position affects the expressiveness of the key poses in a consistent way. Moving the head down leads to decreased arousal (the level of energy), valence (positive or negative) and stance (approaching or avoiding), whereas moving the head up produces an increase along these dimensions [1]. Hence, changing the head position should send intuitive signals that can be used during an interaction. The ALIZ-E target group is children between the ages of 8 and 11. Existing results suggest that they would be able to interpret human emotional body language [2, 3].
Based on these results, an experiment was conducted to test whether the findings of [1] also apply to children. If so, body postures and head position could be used to convey emotions during an interaction.
References
Beck, A., Cañamero, L., Bard, K.: Towards an affect space for robots to display emotional body language. In: Ro-Man 2010. IEEE, Viareggio (2010)
Boone, R.T., Cunningham, J.G.: The Attribution of Emotion to Expressive Body Movements: A Structural Cue Analysis (1996) (manuscript)
Boone, R.T., Cunningham, J.G.: Children’s decoding of emotion in expressive body movement: the development of cue attunement. Dev. Psychol. 34(5), 1007–1016 (1998)
Gillies, M., et al.: Responsive listening behavior. Computer Animation and Virtual Worlds 19(5), 579–589 (2008)
Breazeal, C.: Designing sociable robots. In: Intelligent Robotics & Autonomous Agents. MIT Press, Cambridge (2002)
Beck, A., Stevens, B., Bard, K.: Comparing perception of affective body movements displayed by actors and animated characters. In: AISB 2009, Edinburgh, UK (2009)
de Gelder, B.: Towards the neurobiology of emotional body language. Nature Reviews Neuroscience 7(3), 242–249 (2006)
Kleinsmith, A., De Silva, P.R., Bianchi-Berthouze, N.: Cross-cultural differences in recognizing affect from body posture. Interacting with Computers 18(6), 1371–1389 (2006)
Thomas, F., Johnston, O.: The illusion of life. Abbeville Press, New York (1995)
Ekman, P., Friesen, W.V., Hager, J.C.: Facial action coding system. The manual. Human Face, Salt Lake (2002)
Cassell, J.: Nudge nudge wink wink: elements of face-to-face conversation. In: Cassell, J., et al. (eds.) Embodied Conversational Agents, pp. 1–27. MIT Press, Cambridge (2000)
Vinayagamoorthy, V., et al.: Building Expression into Virtual Characters. In: Eurographics 2006. Proc. Eurographics, Vienna (2006)
De Silva, P.R., Bianchi-Berthouze, N.: Modeling human affective postures: an information theoretic characterization of posture features. Computer Animation and Virtual Worlds 15(3-4), 269–276 (2004)
Atkinson, A.P., et al.: Emotion perception from dynamic and static body expressions in point-light and full-light displays. Perception 33, 717–746 (2004)
Wallbott, H.G.: Bodily expression of emotion. European Journal of Social Psychology 28(6), 879–896 (1998)
Pollick, F.E., et al.: Perceiving affect from arm movement. Cognition 82(2), 51–61 (2001)
Kleinsmith, A., Bianchi-Berthouze, N., Steed, A.: Automatic Recognition of Non-Acted Affective Postures. IEEE Trans. Syst. Man Cybern. B Cybern. (2011)
Schouwstra, S., Hoogstraten, J.: Head position and spinal position as determinants of perceived emotional state. Perceptual and Motor Skills 81(2), 673–674 (1995)
Maestri, G.: Digital character animation, 3rd edn. Kissane, E., Kalning, K. (eds.). New Riders, Berkeley (2006)
Tonks, J., et al.: Assessing emotion recognition in 9–15-year-olds: preliminary analysis of abilities in reading emotion from faces, voices and eyes. Brain Inj. 21(6), 623–629 (2007)
Woods, S., Dautenhahn, K., Schultz, J.: Child and adults’ perspectives on robot appearance. In: AISB 2005 Symposium on Robot Companion, pp. 126–132. SSAISB, Hatfield (2005)
Beck, A., et al.: Interpretation of Emotional Body Language Displayed by Robots. In: Affine 2010. ACM, Firenze (2010)
Author information
Authors and Affiliations
Adaptive Systems Research Group, School of Computer Science & STRI, University of Hertfordshire, United Kingdom
Aryel Beck, Lola Cañamero & Luisa Damiano
Institute of Cognitive Sciences and Technologies, Padova, Italy
Giacomo Sommavilla, Fabio Tesser & Piero Cosi
Editor information
Editors and Affiliations
Department of Computer Sciences, University of Wisconsin-Madison, 1210 W. Dayton St. Room 6381, 53706-1685, Madison, WI, USA
Bilge Mutlu
Canterbury University, HIT Lab NZ Old Math Building, Room 108, Private Bag 4800, 8140, Christchurch, New Zealand
Christoph Bartneck
Department of Innovation Sciences, Eindhoven University of Technology, Section Human-Technology Interaction, IPO 1.36, PO Box 513, 5600 MB, Eindhoven, The Netherlands
Jaap Ham
Department of Electrical Engineering, Mathematics and Computer Science (EEMCS), Human Media Interaction (HMI), University of Twente, PO Box 217, NL-7500 AE, Enschede, The Netherlands
Vanessa Evers
Department of Communication, Laboratories Intelligent Robotics and Communication Laboratories ATR (Advanced Telecommunications Research Institute International), 2-2-2 Hikaridai, 619-0288, Keihanna Science City, Kyoto, Japan
Takayuki Kanda
Copyright information
© 2011 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Beck, A., Cañamero, L., Damiano, L., Sommavilla, G., Tesser, F., Cosi, P. (2011). Children Interpretation of Emotional Body Language Displayed by a Robot. In: Mutlu, B., Bartneck, C., Ham, J., Evers, V., Kanda, T. (eds) Social Robotics. ICSR 2011. Lecture Notes in Computer Science (LNAI), vol. 7072. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-25504-5_7
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-25503-8
Online ISBN: 978-3-642-25504-5
eBook Packages: Computer Science, Computer Science (R0)