
Presentation by Dr. Eduardo Coutinho

Psychoacoustic cues to emotion in music and speech prosody

Abstract: There is strong evidence of shared acoustic profiles common to the expression of emotions in music and speech, yet relatively limited understanding of the specific psychoacoustic features involved. This study combined a controlled experiment and computational modelling to investigate the perceptual codes associated with the expression of emotion in the acoustic domain. The empirical stage of the study provided continuous human ratings of emotions perceived in excerpts of film music and natural speech samples. The computational stage created a computer model that retrieves the relevant information from the acoustic stimuli and predicts the emotional expressiveness of speech and music in close agreement with the responses of human subjects. We show that a significant part of the listeners’ second-by-second reports of emotion perceived in music and speech prosody can be predicted from a set of seven psychoacoustic features: loudness, tempo/speech rate, melody/prosody contour, spectral centroid, spectral flux, sharpness, and roughness. The implications of these results are discussed in the context of cross-modal similarities in the communication of emotion in the acoustic domain. I will also discuss some preliminary evidence on the relationships between psychoacoustic features, physiological responses and subjective feelings of emotion.
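To make the pipeline described in the abstract concrete, here is a minimal sketch (not the authors' actual system) of how a few of the seven psychoacoustic features could be extracted at roughly one frame per second and regressed against continuous listener ratings. The choice of librosa and scikit-learn, the window sizes, and the use of ridge regression are assumptions; sharpness, roughness, and melodic/prosodic contour are omitted because they would require a dedicated psychoacoustics toolbox.

import numpy as np
import librosa
from sklearn.linear_model import Ridge

def extract_features(path, hop_length=22050):
    """Return a (frames x features) matrix at roughly one frame per second."""
    y, sr = librosa.load(path, sr=22050)
    # Loudness proxy: root-mean-square energy per one-second frame
    rms = librosa.feature.rms(y=y, frame_length=2 * hop_length, hop_length=hop_length)[0]
    # Spectral centroid and spectral flux from the magnitude spectrogram
    S = np.abs(librosa.stft(y, n_fft=2048, hop_length=hop_length))
    centroid = librosa.feature.spectral_centroid(S=S, sr=sr)[0]
    flux = np.sqrt(np.sum(np.diff(S, axis=1) ** 2, axis=0))
    flux = np.concatenate([[0.0], flux])
    # Global tempo estimate repeated per frame (speech rate would need a
    # different estimator)
    tempo, _ = librosa.beat.beat_track(y=y, sr=sr)
    tempo = float(np.atleast_1d(tempo)[0])
    n = min(len(rms), len(centroid), len(flux))
    tempo_track = np.full(n, tempo)
    return np.stack([rms[:n], centroid[:n], flux[:n], tempo_track], axis=1)

# Hypothetical usage: X holds frame-wise features and `ratings` is a
# same-length array of second-by-second perceived arousal (or valence)
# collected from listeners.
# X = extract_features("excerpt.wav")
# model = Ridge(alpha=1.0).fit(X[:-10], ratings[:-10])
# predictions = model.predict(X[-10:])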

Bio: Eduardo Coutinho received his degree in Electrical Engineering and Computer Science in 2003 from the University of Porto (Portugal), where he specialized in computational modelling of animal behaviour and multi-agent systems. Following his strong interest in emotions and music, Coutinho pursued a doctoral degree at the School of Computing and Mathematics of the University of Plymouth (UK). In 2008, he was awarded his Ph.D. with a thesis entitled Computational and Psycho-Physiological Investigations of Musical Emotions. In it, Coutinho explores the link between emotional responses to music, low-level psychoacoustic features (elements of sound that are perceived similarly across cultures) and self-perception of physiological activation, by means of a novel methodology consisting of computational investigations based on spatiotemporal neural networks sensitive to structural aspects of music and to listeners’ physiological responses. Since then, Coutinho has been researching the psychophysiological effects of music on human emotions, as well as the expression of emotion in speech. In parallel with his research, he has also developed several projects in composition and interactive art. Currently, Coutinho is the coordinator of the Music and Emotion research focus at the Swiss Center for Affective Sciences, where he addresses different aspects of the link between music and emotion from a transdisciplinary perspective (http://www.affective-sciences.org/musicfocus), and an honorary research fellow at the School of Music of the University of Liverpool (UK). More information at www.eadward.org.
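The thesis methodology mentioned above centres on spatiotemporal (recurrent) neural networks that map the temporal structure of music onto continuous emotional responses. The sketch below is not Coutinho's model; it only illustrates, under assumed choices (PyTorch, a small LSTM, a single rating output), what a frame-wise recurrent predictor of continuous emotion ratings looks like.

import torch
import torch.nn as nn

class EmotionRNN(nn.Module):
    def __init__(self, n_features=7, hidden_size=32):
        super().__init__()
        # The recurrent layer carries temporal context across frames,
        # which is what lets the model track second-by-second ratings.
        self.rnn = nn.LSTM(n_features, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)  # one rating per time step

    def forward(self, x):            # x: (batch, time, n_features)
        out, _ = self.rnn(x)
        return self.head(out)        # (batch, time, 1)

# Hypothetical training step on one excerpt: frame-wise features (1, T, 7)
# and listener ratings (1, T, 1), both as float tensors.
model = EmotionRNN()
features = torch.randn(1, 60, 7)     # e.g. 60 one-second frames
ratings = torch.randn(1, 60, 1)
loss = nn.MSELoss()(model(features), ratings)
loss.backward()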

Date

July 31, 2013

Time

12:00 pm - 1:00 pm

Cost

Free

Location

BRAMS, Suite 0120
1430, boul. Mont-Royal

Organizer

BRAMS
Email
info@brams.umontreal.ca

BRAMS (International Laboratory for Brain, Music and Sound Research) is a unique laboratory dedicated to research excellence in the study of music and auditory cognition with a focus on neuroscience. BRAMS is located in Montreal and jointly affiliated with the University of Montreal and McGill University.

Address

Our civic address
Pavillon Marie-Victorin, Local A-108
90 Vincent-d’Indy Ave., Outremont, QC H2V 2S9

Our mailing address
BRAMS / UdeM Département de psychologie
C.P. 6128, succ. Centre-ville, Montréal, QC H3C 3J7

Contact Us

514 343-6111 ext. 3167
