Christine I. Hooker

Neural activity during social signal perception correlates with self-reported empathy

Authors:

  • Christine I. Hooker

  • Sara C. Verosky

  • Laura T. Germine

  • Robert T. Knight

  • Mark D'Esposito

Date: 2010

DOI: 10.1016/j.brainres.2009.10.006

PubMed: 19836364


Abstract:

Empathy is an important component of human relationships, yet the neural mechanisms that facilitate empathy are unclear. The broad construct of empathy incorporates both cognitive and affective components. Cognitive empathy includes mentalizing skills such as perspective-taking. Affective empathy consists of the affect produced in response to someone else's emotional state, a process facilitated by simulation or "mirroring." Prior evidence shows that mentalizing tasks engage a neural network that includes the temporoparietal junction, superior temporal sulcus (STS), and medial prefrontal cortex. On the other hand, simulation tasks engage the fronto-parietal mirror neuron system (MNS), which includes the inferior frontal gyrus (IFG) and the somatosensory-related cortex (SRC). Here, we tested whether neural activity in these two networks was related to self-reports of cognitive and affective empathy in daily life. Participants viewed social scenes in which a shift in a character's direction of attention did or did not change the character's mental and emotional state. As expected, the task robustly activated both mentalizing and MNS networks. We found that when participants detected the character's change in mental and emotional state, neural activity in both networks was strongly related to cognitive empathy. Specifically, neural activity in the IFG, SRC, and STS was related to cognitive empathy, whereas activity in the precentral gyrus was related to affective empathy. The findings suggest that both simulation and mentalizing networks contribute to multiple components of empathy.

Mentalizing about emotion and its relationship to empathy

Authors:

  • Christine I. Hooker

  • Sara C. Verosky

  • Laura T. Germine

  • Robert T. Knight

  • Mark D'Esposito

Date: 2008

DOI: 10.1093/scan/nsn019

PubMed: 19015112


Abstract:

Mentalizing involves the ability to predict someone else's behavior based on their belief state. More advanced mentalizing skills involve integrating knowledge about beliefs with knowledge about the emotional impact of those beliefs. Recent research indicates that advanced mentalizing skills may be related to the capacity to empathize with others. However, it is not clear what aspect of mentalizing is most related to empathy. In this study, we used a novel, advanced mentalizing task to identify neural mechanisms involved in predicting a future emotional response based on a belief state. Subjects viewed social scenes in which one character had a False Belief and one character had a True Belief. In the primary condition, subjects were asked to predict what emotion the False Belief character would feel if they had a full understanding of the situation. We found that neural regions related to both mentalizing and emotion were involved when predicting a future emotional response, including the superior temporal sulcus, medial prefrontal cortex, temporal poles, somatosensory-related cortices (SRC), inferior frontal gyrus, and thalamus. In addition, greater neural activity in primarily emotion-related regions, including right SRC and bilateral thalamus, when predicting an emotional response was significantly correlated with more self-reported empathy. The findings suggest that predicting emotional response involves generating and using internal affective representations, and that greater use of these affective representations when trying to understand the emotional experience of others is related to more empathy.

The influence of personality on neural mechanisms of observational fear and reward learning

Authors:

  • Christine I. Hooker

  • Sara C. Verosky

  • Asako Miyakawa

  • Robert T. Knight

  • Mark D'Esposito

Date: 2008

DOI: 10.1016/j.neuropsychologia.2008.05.005

PubMed: 18573512


Abstract:

Fear and reward learning can occur through direct experience or observation. Both channels can enhance survival or create maladaptive behavior. We used fMRI to isolate neural mechanisms of observational fear and reward learning and to investigate whether neural response varied according to individual differences in neuroticism and extraversion. Participants learned object-emotion associations by observing a woman respond with fearful (or neutral) and happy (or neutral) facial expressions to novel objects. The amygdala-hippocampal complex was active when learning the object-fear association, and the hippocampus was active when learning the object-happy association. After learning, objects were presented alone; amygdala activity was greater for the fear (vs. neutral) and happy (vs. neutral) associated objects. Importantly, greater amygdala-hippocampal activity during fear (vs. neutral) learning predicted better recognition of learned objects on a subsequent memory test. Furthermore, personality modulated neural mechanisms of learning. Neuroticism positively correlated with neural activity in the amygdala and hippocampus during fear (vs. neutral) learning. Low extraversion/high introversion was related to faster behavioral predictions of the fearful and neutral expressions during fear learning. In addition, low extraversion/high introversion was related to greater amygdala activity during happy (vs. neutral) learning and happy (vs. neutral) object recognition, and to faster reaction times for predicting happy and neutral expressions during reward learning. These findings suggest that neuroticism is associated with an increased sensitivity in the neural mechanism for fear learning, which leads to enhanced encoding of fear associations, and that low extraversion/high introversion is related to enhanced conditionability for both fear and reward learning.

Amygdala response to facial expressions reflects emotional learning

Authors:

  • Christine I. Hooker

  • Laura T. Germine

  • Robert T. Knight

  • Mark D'Esposito

Date: 2006

PubMed: 16943547


Abstract:

The functional role of the human amygdala in the evaluation of emotional facial expressions is unclear. Previous animal and human research shows that the amygdala participates in processing positive and negative reinforcement as well as in learning predictive associations between stimuli and subsequent reinforcement. Thus, amygdala response to facial expressions could reflect the processing of primary reinforcement or emotional learning. Here, using functional magnetic resonance imaging, we tested the hypothesis that amygdala response to facial expressions is driven by emotional association learning. We show that the amygdala is more responsive to learning object-emotion associations from happy and fearful facial expressions than it is to the presentation of happy and fearful facial expressions alone. The results provide evidence that the amygdala uses social signals to rapidly and flexibly learn threatening and rewarding associations that ultimately serve to enhance survival.