Neural activity during social signal perception correlates with self-reported empathy

Authors:

  • Christine I. Hooker

  • Sara C. Verosky

  • Laura T. Germine

  • Robert T. Knight

  • Mark D'Esposito

Date: 2010

DOI: 10.1016/j.brainres.2009.10.006

PubMed: 19836364

Abstract:

Empathy is an important component of human relationships, yet the neural mechanisms that facilitate empathy are unclear. The broad construct of empathy incorporates both cognitive and affective components. Cognitive empathy includes mentalizing skills such as perspective-taking. Affective empathy consists of the affect produced in response to someone else's emotional state, a process facilitated by simulation or "mirroring." Prior evidence shows that mentalizing tasks engage a neural network that includes the temporoparietal junction, superior temporal sulcus (STS), and medial prefrontal cortex. Simulation tasks, on the other hand, engage the fronto-parietal mirror neuron system (MNS), which includes the inferior frontal gyrus (IFG) and the somatosensory-related cortex (SRC). Here, we tested whether neural activity in these two networks was related to self-reports of cognitive and affective empathy in daily life. Participants viewed social scenes in which a shift in a character's direction of attention did or did not change the character's mental and emotional state. As expected, the task robustly activated both the mentalizing and MNS networks. We found that when participants detected the character's change in mental and emotional state, neural activity in both networks was strongly related to cognitive empathy. Specifically, activity in the IFG, SRC, and STS was related to cognitive empathy, while activity in the precentral gyrus was related to affective empathy. The findings suggest that both simulation and mentalizing networks contribute to multiple components of empathy.

Mentalizing about emotion and its relationship to empathy

Authors:

  • Christine I. Hooker

  • Sara C. Verosky

  • Laura T. Germine

  • Robert T. Knight

  • Mark D'Esposito

Date: 2008

DOI: 10.1093/scan/nsn019

PubMed: 19015112

Abstract:

Mentalizing involves the ability to predict someone else's behavior based on their belief state. More advanced mentalizing skills involve integrating knowledge about beliefs with knowledge about the emotional impact of those beliefs. Recent research indicates that advanced mentalizing skills may be related to the capacity to empathize with others. However, it is not clear which aspect of mentalizing is most related to empathy. In this study, we used a novel, advanced mentalizing task to identify neural mechanisms involved in predicting a future emotional response based on a belief state. Subjects viewed social scenes in which one character had a False Belief and one character had a True Belief. In the primary condition, subjects were asked to predict what emotion the False Belief character would feel if they had a full understanding of the situation. We found that neural regions related to both mentalizing and emotion were involved when predicting a future emotional response, including the superior temporal sulcus, medial prefrontal cortex, temporal poles, somatosensory-related cortices (SRC), inferior frontal gyrus, and thalamus. In addition, greater activity in primarily emotion-related regions, including right SRC and bilateral thalamus, during emotion prediction was significantly correlated with higher self-reported empathy. The findings suggest that predicting emotional response involves generating and using internal affective representations, and that greater use of these affective representations when trying to understand the emotional experience of others is related to more empathy.

Amygdala response to facial expressions reflects emotional learning

Authors:

  • Christine I. Hooker

  • Laura T. Germine

  • Robert T. Knight

  • Mark D'Esposito

Date: 2006

PubMed: 16943547

Abstract:

The functional role of the human amygdala in the evaluation of emotional facial expressions is unclear. Previous animal and human research shows that the amygdala participates in processing positive and negative reinforcement as well as in learning predictive associations between stimuli and subsequent reinforcement. Thus, amygdala response to facial expressions could reflect the processing of primary reinforcement or emotional learning. Here, using functional magnetic resonance imaging, we tested the hypothesis that amygdala response to facial expressions is driven by emotional association learning. We show that the amygdala is more responsive to learning object-emotion associations from happy and fearful facial expressions than it is to the presentation of happy and fearful facial expressions alone. The results provide evidence that the amygdala uses social signals to rapidly and flexibly learn threatening and rewarding associations that ultimately serve to enhance survival.