Elsevier

NeuroImage

Volume 218, September 2020, 116512

Dynamic intersubject neural synchronization reflects affective responses to sad music

https://doi.org/10.1016/j.neuroimage.2019.116512
Open access under a Creative Commons license

Highlights

  • Sad music listening was associated with intersubject synchronization in auditory, affective, and prefrontal brain regions.

  • Fantasy, a component of empathy, modulated intersubject synchronization, particularly in prefrontal regions.

  • Continuous enjoyment ratings predicted moment-to-moment synchronization in several large-scale brain networks.

  • Continuous feelings of sadness predicted moment-to-moment intersubject synchronization mainly in limbic and striatal networks.

Abstract

Psychological theories of emotion often highlight the dynamic quality of affective experience, yet neuroimaging studies of affect have traditionally relied on static stimuli that lack ecological validity. Consequently, the brain regions that represent emotions and feelings as they unfold remain unclear. Recently, dynamic, model-free analytical techniques have been employed with naturalistic stimuli to better capture time-varying patterns of activity in the brain; yet, few studies have focused on relating these patterns to changes in subjective feelings. Here, we address this gap, using intersubject correlation and phase synchronization to assess how stimulus-driven changes in brain activity and connectivity are related to two aspects of emotional experience: emotional intensity and enjoyment. During fMRI scanning, healthy volunteers listened to a full-length piece of music selected to induce sadness. After scanning, participants listened to the piece twice while simultaneously rating the intensity of felt sadness or felt enjoyment. Activity in the auditory cortex, insula, and inferior frontal gyrus was significantly synchronized across participants. Synchronization in auditory, visual, and prefrontal regions was significantly greater in participants with higher scores on a subscale of trait empathy related to feeling emotions in response to music. When assessed dynamically, continuous enjoyment ratings positively predicted a moment-to-moment measure of intersubject synchronization in auditory, default mode, and striatal networks, as well as the orbitofrontal cortex, whereas sadness predicted intersubject synchronization in limbic and striatal networks. The results suggest that stimulus-driven patterns of neural communication in emotion-processing and high-level cortical regions carry meaningful information with regard to our feelings in response to a naturalistic stimulus.
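The two measures named in the abstract, intersubject correlation and moment-to-moment intersubject phase synchronization, can be sketched in a few lines of NumPy/SciPy. This is a minimal illustration only, not the authors' analysis pipeline: the function names, the leave-one-out formulation of ISC, and the Hilbert-transform estimate of instantaneous phase are common choices in the literature but are assumptions here.

```python
import numpy as np
from scipy.signal import hilbert


def leave_one_out_isc(data):
    """Leave-one-out intersubject correlation (ISC).

    data: array of shape (n_subjects, n_timepoints) holding one
    regional BOLD time series per subject.
    Returns one Pearson r per subject: that subject's series
    correlated with the mean series of all other subjects.
    """
    n_subjects = data.shape[0]
    isc = np.empty(n_subjects)
    for s in range(n_subjects):
        others = np.delete(data, s, axis=0).mean(axis=0)
        isc[s] = np.corrcoef(data[s], others)[0, 1]
    return isc


def dynamic_phase_sync(data):
    """Moment-to-moment intersubject phase synchronization.

    Instantaneous phase is taken from the analytic signal (Hilbert
    transform). At each timepoint, the length of the mean resultant
    vector of the subjects' phases is returned: 0 means no phase
    alignment across subjects, 1 means perfect alignment.
    """
    phases = np.angle(hilbert(data, axis=1))
    return np.abs(np.exp(1j * phases).mean(axis=0))
```

ISC yields one static value per subject, whereas the phase-synchronization measure yields a full time course, which is what allows continuous enjoyment and sadness ratings to be regressed against synchronization on a moment-to-moment basis.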

Keywords

Naturalistic stimuli
Emotion
Music
Enjoyment
Intersubject synchronization
