Researchers pinpoint part of the brain that recognizes facial expressions

Read time: 3 minutes

New machine learning algorithm can identify the facial expression a person is looking at based on neural activity.


Researchers at The Ohio State University have pinpointed the area of the brain responsible for recognizing human facial expressions.


It's on the right side of the brain behind the ear, in a region called the posterior superior temporal sulcus (pSTS).




In a paper published in the Journal of Neuroscience, the researchers report that they used functional magnetic resonance imaging (fMRI) to identify a region of pSTS as the part of the brain activated when test subjects looked at images of people making different facial expressions.


Further, the researchers have discovered that neural patterns within the pSTS are specialized for recognizing movement in specific parts of the face. One pattern is tuned to detect a furrowed brow, another is tuned to detect the upturn of lips into a smile, and so on.


"That suggests that our brains decode facial expressions by adding up sets of key muscle movements in the face of the person we are looking at," said Aleix Martinez, a cognitive scientist and professor of electrical and computer engineering at Ohio State.


Martinez said that he and his team were able to create a machine learning algorithm that identifies which facial expression a person is looking at based solely on the fMRI signal from this brain region.
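
The article does not detail how the decoder works internally, but at a high level it is a multi-class classifier trained on voxel activity patterns. The sketch below is a hypothetical illustration of that idea using scikit-learn; the array shapes, random placeholder data and choice of logistic regression are assumptions for illustration, not the authors' actual pipeline.

```python
# Hypothetical sketch of an fMRI expression decoder (not the authors' code).
# Assumed shapes: X is (n_trials, n_voxels) pSTS activity, y is the expression
# category shown on each trial (7 categories in the study).
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.standard_normal((700, 500))      # placeholder voxel activity patterns
y = rng.integers(0, 7, size=700)         # placeholder labels for 7 expressions

decoder = make_pipeline(
    StandardScaler(),                    # normalize each voxel's response
    LogisticRegression(max_iter=1000),   # linear multi-class classifier
)
decoder.fit(X[:600], y[:600])            # train on most trials
print("held-out accuracy:", decoder.score(X[600:], y[600:]))
```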


fMRI showing activity in the posterior superior temporal sulcus (pSTS) region of the brain of a test subject who is recognizing a facial expression. Courtesy: Ohio State University
 


"Humans use a very large number of facial expressions to convey emotion, other non-verbal communication signals and language," Martinez said.


"Yet, when we see someone make a face, we recognize it instantly, seemingly without conscious awareness. In computational terms, a facial expression can encode information, and we've long wondered how the brain is able to decode this information so efficiently.


"Now we know that there is a small part of the brain devoted to this task."




Using this fMRI data, the researchers developed a machine learning algorithm that has about a 60 percent success rate in decoding human facial expressions, regardless of the facial expression and regardless of the person viewing it.


"That's a very powerful development, because it suggests that the coding of facial expressions is very similar in your brain and my brain and most everyone else's brain," Martinez said.


The study doesn't say anything about people who exhibit atypical neural functioning, but it could give researchers new insights, said study co-author Julie Golomb, assistant professor of psychology and director of the Vision and Cognitive Neuroscience Lab at Ohio State.


"This work could have a variety of applications, helping us not only understand how the brain processes facial expressions, but ultimately how this process may differ in people with autism, for example," she said.


Doctoral student Ramprakash Srinivasan, Golomb and Martinez placed 10 college students into an fMRI machine and showed them more than 1,000 photographs of people making facial expressions. The expressions corresponded to seven different emotional categories: disgusted, happily surprised, happily disgusted, angrily surprised, fearfully surprised, sadly fearful and fearfully disgusted.


While some of the expressions were positive and others negative, they all shared commonalities. For instance, "happily surprised," "angrily surprised" and "fearfully surprised" all include raised eyebrows, though other parts of the face differ when we express these three emotions.


fMRI detects increased blood flow in the brain, so the research group was able to obtain images of the part of the brain that was activated when the students recognized different expressions. Regardless of the expression they were looking at, all the students showed increased activity in the same region—the pSTS.


Then the research group used a computer to cross-reference the fMRI images with the different facial muscle movements shown in the test photographs. They were able to create a map of regions within the pSTS that activated for different facial muscle groups, such as the muscles of the eyebrows or lips.
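
The article does not spell out how that cross-referencing was done. One simple, hypothetical way to build such a map is to regress each voxel's response onto indicators of which facial muscle groups move in each photograph, then label each voxel by the muscle group that drives it most strongly. The sketch below uses synthetic placeholder data and only illustrates that general approach, not the study's actual analysis.

```python
# Hypothetical sketch: mapping pSTS voxels to facial muscle groups.
# Assumed inputs: `muscles` is (n_images, n_groups), 1 where a muscle group
# (e.g. eyebrow raise, lip-corner pull) moves in the photo; `voxels` is
# (n_images, n_voxels) activity evoked by each photo. Data here are synthetic.
import numpy as np

rng = np.random.default_rng(1)
n_images, n_voxels, n_groups = 1000, 200, 5
muscles = rng.integers(0, 2, size=(n_images, n_groups)).astype(float)
true_tuning = rng.standard_normal((n_groups, n_voxels))
voxels = muscles @ true_tuning + 0.5 * rng.standard_normal((n_images, n_voxels))

# Least-squares fit: weights[g, v] says how strongly muscle group g drives voxel v.
weights, *_ = np.linalg.lstsq(muscles, voxels, rcond=None)

# Label each voxel by its strongest muscle group, giving a crude within-pSTS map.
voxel_map = np.abs(weights).argmax(axis=0)
print("voxels assigned to each muscle group:", np.bincount(voxel_map, minlength=n_groups))
```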




First, they constructed maps using the fMRI data of nine of the participants. They then fed the algorithm the fMRI images from the 10th student and asked it to identify the expressions that student was looking at. They repeated the experiment, each time rebuilding the map from scratch with data from a different set of nine students and using the remaining student as the 10th, held-out subject.


About 60 percent of the time, the algorithm was able to accurately identify the facial expression that the 10th person was looking at, based solely on that person's fMRI image.
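
This is a leave-one-subject-out cross-validation: train the decoder on nine subjects, test it on the tenth, and rotate so every subject serves once as the held-out case. A sketch of that loop, with hypothetical data shapes and a placeholder classifier standing in for the study's algorithm, might look like this:

```python
# Hypothetical leave-one-subject-out evaluation loop (synthetic placeholder data).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n_subjects, trials_per_subject, n_voxels = 10, 100, 300
X = rng.standard_normal((n_subjects, trials_per_subject, n_voxels))
y = rng.integers(0, 7, size=(n_subjects, trials_per_subject))

accuracies = []
for held_out in range(n_subjects):
    train = [s for s in range(n_subjects) if s != held_out]
    clf = LogisticRegression(max_iter=1000)
    clf.fit(X[train].reshape(-1, n_voxels), y[train].ravel())  # fit on 9 subjects
    accuracies.append(clf.score(X[held_out], y[held_out]))     # test on the 10th
print("mean accuracy across held-out subjects:", np.mean(accuracies))
```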


Martinez called the results "very positive," and said they show the team is making strides toward understanding what happens in that region of the brain.


Note: Material may have been edited for length and content. For further information, please contact the cited source.

Ohio State University. Original reporting by Pam Frost Gorder.