
Presentation by Dr. Antonio Camurri

Music and emotion: Toward computational models of empathy and entrainment

Abstract: The study of intended and unintended interpersonal co-ordination in humans is one of the most interesting and challenging topics in the psychological and behavioral sciences and, in recent years, also in human-computer interaction, social media, and interactive multimedia systems. Scientific research aims at developing intelligent social interfaces, with a focus on non-verbal communication, embodiment, emotion, and enaction (e.g., Camurri and Frisoli 2006). As in the natural sciences and medicine, the co-ordination phenomenon is known as entrainment or synchronisation. There is no generally accepted scientific definition of entrainment. Pikovsky et al. (2001) define it as “an adjustment of rhythms of oscillating objects due to their weak interaction”. Entrainment and related phenomena can be studied by focusing on different kinds of synchronisation (phase synchronisation, generalized synchronisation, complete synchronisation), on different experimental conditions (e.g., “passive” or “active” experiments), and on different physical observables (e.g., physiological data, motor behavior data, gesture, audio signals). In our research, we focus on gesture both as a simple physical signal and as expressive gesture, that is, gesture as a conveyor of non-verbal emotional content (e.g., Camurri et al. 2004; Castellano et al. 2008). In this seminar, I present ongoing research at our Centre investigating how non-verbal, full-body expressive gestural communication can play a relevant role in entraining people under different perceptual coupling strengths and induced emotional states. We aim at designing computational models of entrainment and at developing novel social, affective interfaces grounded in non-verbal, full-body signals. Music is an ideal field in which to study such phenomena.
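The Pikovsky et al. definition quoted above — rhythms adjusting through weak interaction — can be illustrated with a minimal sketch of two weakly coupled phase oscillators (the Kuramoto model). This model is an illustrative assumption on my part, not one of the seminar's computational models; all parameter values below are arbitrary.

```python
import math

def simulate(w1=1.00, w2=1.05, K=0.2, dt=0.01, steps=20000):
    """Two phase oscillators with slightly different natural
    frequencies w1, w2, weakly coupled with strength K.
    Returns the final wrapped phase difference th1 - th2."""
    th1, th2 = 0.0, math.pi / 2
    for _ in range(steps):
        # Kuramoto coupling: each oscillator is pulled toward the other's phase
        d1 = w1 + K * math.sin(th2 - th1)
        d2 = w2 + K * math.sin(th1 - th2)
        th1 += d1 * dt
        th2 += d2 * dt
    # Wrap the phase difference into (-pi, pi]; a near-constant value
    # indicates phase synchronisation (entrainment)
    return math.atan2(math.sin(th1 - th2), math.cos(th1 - th2))

diff = simulate()
```

Because the frequency mismatch (0.05) is smaller than twice the coupling strength (0.4), the two rhythms phase-lock: the phase difference settles at a constant offset, asin(-0.125) ≈ -0.125 rad, rather than drifting. With the coupling removed (K=0), the difference would grow without bound — the qualitative signature one looks for in entrainment data.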
Consider, for example, a string quartet: its members are highly trained in coordination and achieve an empathic behavior that enables them to convey emotion to their audience using only non-verbal signals (gesture, music). We may argue that entrainment occurs among the musicians playing together, as well as between the musicians and their audience. Our experiments in music and dance aim at iteratively developing and testing computational models of emotional entrainment. Further, the EyesWeb XMI Gesture and Social Processing Software Library is presented, with video demonstrations. The seminar also discusses a few case studies and experiments developed at our Casa Paganini – InfoMus Intl Research Centre (www.casapaganini.org). The ancient monumental building of Casa Paganini, which includes a concert auditorium and several museum rooms, serves as an ecological setting for our experiments. Finally, the seminar presents examples of the exploitation of our research results in artistic projects and in the music and multimedia industry, with a particular focus on the social active music listening paradigms developed in the EU ICT SAME Project (www.sameproject.eu) (Camurri et al., 2008).

A. Pikovsky, M.G. Rosenblum, J. Kurths (2001) Synchronization: A Universal Concept in Nonlinear Science, Cambridge University Press, Cambridge.
A. Camurri, B. Mazzarino, G. Volpe (2004) Analysis of Expressive Gesture: The EyesWeb Expressive Gesture Processing Library, in A. Camurri, G. Volpe (Eds.), Gesture-based Communication in Human-Computer Interaction, LNAI 2915, pp.460-467, Springer Verlag.
A. Camurri, A. Frisoli (Guest Editors) (2006) Special Issue of Virtual Reality Journal on Multisensory Interaction in Virtual Environments, Vol.10, No.1, Springer.
G. Varni, A. Camurri, P. Coletta, G. Volpe (2008) “Emotional Entrainment in Music Performance”, Proc. 8th IEEE Intl Conf on Automatic Face and Gesture Recognition, Sept. 17-19, Amsterdam.
G. Castellano, A. Camurri, M. Mortillaro, K. Scherer, G. Volpe (2008) Expressive Gesture and Music: Analysis of Emotional Behaviour in Music Performance, Music Perception, Vol.25, No.6, pp.103-119, University of California Press.
A. Camurri, C. Canepa, P. Coletta, B. Mazzarino, G. Volpe (2008) “Mappe per Affetti Erranti: a Multimodal System for Social Active Listening and Expressive Performance”. Proc. Intl Conf NIME 2008 – New Interfaces for Musical Expression, Casa Paganini, University of Genoa (www.nime.org).

Bio: Antonio Camurri is an Associate Professor at the University of Genova, where he teaches “Software Engineering” and “Multimedia Systems”. His research interests include human-computer interaction, multimodal interfaces for expressive and emotional non-verbal communication, sound and music computing, and interactive multimedia systems. He is the founder and scientific director of InfoMus Lab (www.infomus.org), a member of the Executive Committee of the IEEE CS Technical Committee on Computer Generated Music, a founding member of the Italian Association for Artificial Intelligence, an Associate Editor of the Journal of New Music Research (Taylor & Francis), and a main contributor to the EU Roadmap on “Sound and Music Computing” (2007, smcnetwork.org). He is the author of more than 100 scientific publications in international journals and conference proceedings, and has chaired international events, including the 1st Intl Workshop on Kansei – The Technology of Emotion (1997); the tracks on Kansei Information Processing at the IEEE Intl Conf SMC’98 and SMC’99; the V Intl Gesture Workshop (2003); the ENACTIVE 2005 Intl Conference; and New Interfaces for Musical Expression – NIME 2008. He was Program Chair of the ArTech 2008 Intl Conf, Porto; keynote speaker at the INTEL European Research and Innovation Conference, 2008, Dublin; keynote speaker at the Intl Gesture Workshop 2009, Bielefeld Univ.; and co-editor of the special issue of IEEE Multimedia on “Multisensory Communication and Experience through Multimedia” (2004). He is Project Coordinator of the EU ICT STREP project SAME (7FP, Networked Media, 2008-2010, www.sameproject.eu) and of the EU IST project MEGA (Multisensory Expressive Gesture Applications, 5FP, 2001-2003, www.megaproject.org), and local project manager of several EU projects (including TAI-CHI, ENACTIVE, HUMAINE, U-CREATE, MIAMI, S2S^2, ConGAS) and industry projects (INTEL, SIPRA Spa, Acquario di Genova, Museo del Mare e della Navigazione, Museum of Normandie Roche d’Oetre – UNESCO). In 2005 he founded, and now directs, the Intl Centre of Excellence Casa Paganini – InfoMus Lab (www.casapaganini.org). He holds five patents on software and music multimedia systems.


Apr 16 2009, 4:00 pm – 5:00 pm

Clara Lichtenstein Recital Hall (C-209)
CIRMMT, Strathcona Music Building, 555 Sherbrooke St. West



BRAMS (International Laboratory for Brain, Music and Sound Research) is a unique centre dedicated to research excellence in the study of music and auditory cognition with a focus on neuroscience. The research centre is located in Montreal and jointly affiliated with the University of Montreal and McGill University.


Our civic address
Pavillon Marie-Victorin/ Local A-108
90 Vincent-d’Indy Ave., Outremont, QC H2V 2S9

Our mailing address
BRAMS / UdeM – FAS – Département de psychologie
CP 6128, succ. Centre-ville/ Montréal, QC H3C 3J7

Contact Us

514.343.6111 ext. 3167