Come and meet him in person!
(Please note that the lecture will be in English)
Wednesday February 23rd, 2022, from 3:00 to 4:00 p.m.
- Université de Montréal, Pavillon Marie-Victorin, Room D-427: Registration was via a Doodle link. Due to the room's capacity, only the first 22 people who registered will be able to attend the conference in person. Please note that the Doodle link is now closed, as the maximum number of participants has been reached.
- The lecture will also be available via Zoom. No registration required. Meeting ID: 865 8851 1947 / Passcode: 409397
- The lecture will also be streamed live on Facebook. No registration required.
Hemispheric specialization for speech and music: acoustical cues and connectivity profiles
We have known that the brain’s two hemispheres function differently since the 19th century, but exactly how and why they are specialized has remained an open scientific question. In this lecture, I will focus on the idea that this phenomenon is best explained in terms of acoustical feature processing, rather than cognitive domain specificity. I will first review evidence that processing of fine frequency differences depends on the right auditory cortex, in contrast to preferential processing of speech sounds in the left auditory cortex. I will then propose that this segregation of function can be explained by differential sensitivity to the most relevant acoustical features of speech and music. The left auditory cortex is specialized for processing temporal modulations, which are necessary for speech comprehension, whereas the right auditory cortex is specialized for spectral modulations, which are particularly relevant for musical comprehension. Finally, I will present evidence implicating different connectivity patterns as a physiological basis for the specialization of the left and right auditory cortices. The specialization of the brain’s hemispheres can be thought of as a biological adaptation that gives humans two parallel auditory communication signals, speech and music, allowing us to transmit knowledge and emotions via these two channels.
Robert Zatorre was born and raised in Buenos Aires, Argentina. He studied music and psychology at Boston University and obtained his PhD at Brown University, followed by postdoctoral work with Brenda Milner at the Montreal Neurological Institute of McGill University, where he has been ever since, and where he currently holds a Canada Research Chair in Auditory Cognitive Neuroscience. His laboratory studies the neural substrates of auditory cognition, especially music. Together with his many students and collaborators, he has published over 300 scientific papers on topics including pitch perception, auditory imagery, music production, and brain plasticity. He is perhaps best known for discovering how the brain’s reward system gives rise to musical pleasure. In 2006, with Prof. Isabelle Peretz, he co-founded the International Laboratory for Brain, Music and Sound Research (BRAMS), a unique multi-university consortium dedicated to the cognitive neuroscience of music. His work has been recognized by international prizes, including the neuronal plasticity prize from the IPSEN Foundation, the Knowles prize in hearing research (Northwestern University), the C.L. de Carvalho-Heineken prize in cognitive science (Amsterdam), and the Grand Prix Scientifique from the Fondation Pour l’Audition in Paris. He lives in Montreal with his wife and close collaborator, Prof. Virginia Penhune.