The brain in someone else’s shoes – How to measure empathy

How do we know what another person is feeling? How can we tell that our friend is happy about her birthday present? How do we know our sister is sad at the family celebration?

 

We may be able to tell because our friend is smiling when handed the present and because our sister is crying at the family gathering. But what if our friend was only smiling to be polite? And what if our sister was crying at a wedding? Would we still think she was sad, or could the opposite be the case?

 

These simple examples demonstrate that understanding others’ feelings is not an easy task. It is even more difficult in situations where emotions mix, such as a conversation with a friend.

 

Like a supercomputer, we have to process a multitude of information to understand what others are feeling: information about their facial expression, behaviour, and tone of voice. Other factors, such as familiarity, expressiveness, or their similarity to us, may also be important. While some people are really good at understanding others’ feelings, others struggle, with most people falling somewhere in between.

 

Research in psychology and psychiatry has shown that people differ in their ability to understand and share other people’s feelings – to empathise. People with certain mental health conditions, such as autism and schizophrenia, often find it hard to empathise with others and, as a result, find it hard to predict or respond appropriately to their behaviour. But how can we tell? How can we measure how empathic someone is? How about asking people? This is actually what has been done in the majority of research studies. People are given questionnaires on which they rate their agreement with various statements, such as:

 

“When I am upset at someone, I usually try to ‘put myself in his shoes’ for a while”

 

Most of these questionnaires have been tested in large studies, so an individual’s score can be compared to the average score of a population. There are, however, some problems with this approach. When people rate how empathic they are, we only learn how empathic they think they are, not how empathic they actually are. For instance, on these kinds of questionnaires women generally report finding it easier to understand other people’s feelings than men do. However, when we use more objective measures of empathy, those differences tend to disappear. So, what are objective ways to measure empathy?

 

Let’s go back to that supercomputer we mentioned earlier. The human brain is actually comparable to such a high-tech machine, integrating all the different information from the environment. So, measuring brain activity during an empathy task might provide us with some more objective information. In studies we ran at the University of Southampton and King’s College London, we developed a new empathy task suitable for studying brain activity. Going forward, this task might help us understand the differences between people with high and low levels of empathy, as well as patients with psychiatric or neurological conditions that lead to empathy deficits. To study brain activity, we used a method called functional magnetic resonance imaging (fMRI).

 

Conventional empathy tasks often use pictures to evoke empathy, such as an image of someone crying or laughing, or of a person trapping their hand in a door. However, a static picture does not capture the complexity of real-life situations. For example, when having a conversation with someone, the strength of their emotions will continuously change, as will their tone of voice and their facial expressions. Our task takes all of these issues into consideration.

 

Our empathic accuracy task is based on an older task developed by Jamil Zaki and colleagues (Zaki et al., 2009) and uses videos of people recalling autobiographical experiences in which they felt strong positive or negative emotions (e.g. a sailing trip with a friend or the death of a pet). We modified this task in several ways; for instance, by adding non-emotional video clips in which people described their bedroom. By subtracting the brain activity in response to those neutral clips from the activity evoked by the emotional clips, we can see which brain networks respond specifically to the emotional component of the videos, and not to other features of the clips such as a person’s face or voice. We also focused each clip on a specific emotion, such as sadness or happiness, to be able to study differences between the processing of different emotions.
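For readers curious about the subtraction logic, here is a toy sketch of the idea (not our actual fMRI analysis, which uses statistical modelling of scanner data): for each location in the brain, average the activity measured during emotional clips and subtract the average measured during neutral clips. Locations with a large positive difference respond more to the emotional content specifically. All numbers below are invented for illustration.

```python
# Toy illustration of the subtraction (contrast) idea behind comparing
# emotional and neutral clips. Each "run" is a list of activity values,
# one per hypothetical brain location ("voxel").
def contrast(emotional_runs, neutral_runs):
    """Per-voxel mean(emotional) minus mean(neutral) across runs."""
    n_voxels = len(emotional_runs[0])
    mean_emo = [sum(run[v] for run in emotional_runs) / len(emotional_runs)
                for v in range(n_voxels)]
    mean_neu = [sum(run[v] for run in neutral_runs) / len(neutral_runs)
                for v in range(n_voxels)]
    return [e - n for e, n in zip(mean_emo, mean_neu)]

# Three hypothetical voxels; two emotional and two neutral clips.
emotional = [[2.0, 5.0, 1.0], [2.2, 5.4, 1.1]]
neutral   = [[2.1, 1.0, 1.0], [1.9, 1.2, 0.9]]
print(contrast(emotional, neutral))  # the second voxel stands out
```

In this made-up example, the second voxel is the only one whose activity differs markedly between conditions, so it is the one the subtraction highlights; the first and third respond similarly to both clip types (e.g. to faces and voices in general).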

 

In our study, which has just been published in the journal NeuroImage, healthy participants lay in an MRI scanner and watched video clips of narrators talking about happy or sad autobiographical events, or describing their bedroom (the neutral condition). Throughout each clip, participants used a button box to rate the narrator’s emotions while he or she was talking, adjusting their rating continuously whenever they perceived the intensity of the emotion going up or down. The narrators had also rated their own clips at the time of filming. By testing how well these two timelines of ratings match, we get an empathic accuracy score, which expresses how accurately our participants perceived the narrators’ emotional state. This is combined with questions about how much our participants shared the emotions of the narrators.
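To make the matching of the two rating timelines concrete, here is a small sketch. A common way to quantify how well two continuous timelines agree (and the approach used in Zaki and colleagues’ original task) is the Pearson correlation between them; the exact scoring in any given study may differ, and the rating values below are made up.

```python
# Sketch: an empathic accuracy score as the Pearson correlation between a
# participant's continuous ratings and the narrator's own ratings of the
# same clip. Both timelines must be sampled at the same moments.
from math import sqrt

def empathic_accuracy(participant, narrator):
    """Pearson correlation between two equal-length rating timelines."""
    n = len(participant)
    mp = sum(participant) / n
    mn = sum(narrator) / n
    cov = sum((p - mp) * (q - mn) for p, q in zip(participant, narrator))
    sp = sqrt(sum((p - mp) ** 2 for p in participant))
    sn = sqrt(sum((q - mn) ** 2 for q in narrator))
    return cov / (sp * sn)

# Hypothetical ratings sampled once per second (1 = calm, 9 = very emotional):
narrator_ratings    = [1, 2, 4, 6, 8, 9, 7, 5, 3, 2]
participant_ratings = [1, 1, 3, 6, 7, 9, 8, 6, 4, 2]

score = empathic_accuracy(participant_ratings, narrator_ratings)
print(round(score, 2))  # values near 1.0 mean the two timelines track closely
```

A participant who notices the narrator’s emotion rising and falling at the right moments gets a score near 1, even if their absolute ratings are slightly higher or lower than the narrator’s own.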

 

Overall, our participants performed well on the task and achieved high empathic accuracy scores. They also reported sharing the same emotion as the narrators for most video clips (e.g. they would become sad while watching a sad clip). When it came to brain activity, we found that key brain regions, such as the temporo-parietal junction and the insula, dynamically tracked changes in emotional intensity throughout the video clips. In other words, when participants perceived the narrator as more emotional, activity in these areas was higher too. Notably, the temporo-parietal junction is a region in which people with autism have shown reduced brain activity. Similar brain regions were also more active during the emotional compared to the neutral video clips. We propose that this tracking of the intensity of emotions might be a mechanism through which we achieve feelings of empathy for others.

 

The empathic accuracy task successfully elicits empathy, and we believe it is a good way of studying the brain mechanisms that support empathy – better than previous tasks using static images of people in pain or of painful facial expressions. In the long term, we hope that this task will help researchers gain a better understanding of the empathic difficulties that are often seen in people with psychiatric conditions such as autism or schizophrenia. Studying empathy as a dynamic process is an important step in that direction.

 


King's College London, De Crespigny Park, Camberwell, London SE5 8AF, UK

©2017 BY KCL CULTURAL AND SOCIAL NEUROSCIENCE GROUP