Scientists map brain's 'thesaurus' to help decode inner thoughts


Neuroimaging reveals detailed semantic maps across human cerebral cortex


What if a map of the brain could help us decode people's inner thoughts?


Scientists at the University of California (UC), Berkeley, have taken a step in that direction by building a "semantic atlas" that shows in vivid colors and multiple dimensions how the human brain organizes language. The atlas identifies brain areas that respond to words that have similar meanings.




The findings, published in the journal Nature and funded by the National Science Foundation (NSF), are based on a brain imaging study that recorded neural activity while study volunteers listened to stories from "The Moth Radio Hour." They show that at least one-third of the brain's cerebral cortex, including areas dedicated to high-level cognition, is involved in language processing.


Notably, the study found that different people share similar language maps.


"The similarity in semantic topography across different subjects is really surprising," said study lead author Alex Huth, a postdoctoral researcher in neuroscience at UC Berkeley.


When spoken words fail

Detailed maps showing how the brain organizes different words by their meanings could eventually help give voice to those who cannot speak, such as people who have had a stroke or brain damage, or who have a motor neuron disease such as ALS. While mind-reading technology remains far off, charting how language is organized in the brain brings decoding inner dialogue a step closer to reality, the researchers said.


"This discovery paves the way for brain-machine interfaces that can interpret the meaning of what people want to express," Huth said. "Imagine a brain-machine interface that doesn't just figure out what sounds you want to make, but what you want to say."


For example, clinicians could track the brain activity of patients who have difficulty communicating and then match that data to semantic language maps to determine what their patients are trying to express. Another potential application is a decoder that translates what you say into another language as you speak.
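
As a minimal, purely illustrative sketch of that matching idea: compare an observed activity pattern against a stored predicted pattern for each candidate word and pick the closest. All function names, array shapes, and data below are assumptions for the example, not the study's actual pipeline.

```python
import numpy as np

def decode_word(activity, word_patterns, vocabulary):
    """Return the word whose predicted cortical pattern best matches
    the observed activity, scored by cosine similarity."""
    sims = word_patterns @ activity
    sims = sims / (np.linalg.norm(word_patterns, axis=1)
                   * np.linalg.norm(activity) + 1e-9)
    return vocabulary[int(np.argmax(sims))]

vocab = ["mother", "house", "fear", "seven"]
# Stand-in for a fitted semantic map: one predicted pattern per word.
patterns = np.random.default_rng(0).standard_normal((len(vocab), 1000))
# A noisy observation of the "fear" pattern.
observed = patterns[2] + 0.1 * np.random.default_rng(1).standard_normal(1000)
print(decode_word(observed, patterns, vocab))  # -> fear
```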




"To be able to map out semantic representations at this level of detail is a stunning accomplishment," said Kenneth Whang, a program director in the NSF Information and Intelligent Systems division. "In addition, they are showing how data-driven computational methods can help us understand the brain at the level of richness and complexity that we associate with human cognitive processes."


Huth and six other native English speakers participated in the experiment, which required volunteers to remain still inside a functional magnetic resonance imaging (fMRI) scanner for hours at a time.


Each study participant's brain blood flow was measured as they listened, with eyes closed and headphones on, to more than two hours of stories from The Moth Radio Hour, a public radio show in which people recount humorous and poignant autobiographical experiences.


The participants' brain imaging data were then matched against time-coded, phonemic transcriptions of the stories. Phonemes are units of sound that distinguish one word from another.
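
To make the time-coding concrete, here is a toy sketch that assigns each word in a transcription to the fMRI volume during which it was spoken. The 2-second repetition time (TR) and the onset times are invented for illustration.

```python
TR = 2.0  # seconds per fMRI volume (assumed for the example)
# (word, onset in seconds) pairs from a hypothetical transcription
words = [("above", 0.4), ("the", 0.7), ("city", 1.1),
         ("we", 2.3), ("drifted", 2.8)]

n_volumes = 3
words_per_volume = [[] for _ in range(n_volumes)]
for word, onset in words:
    volume = int(onset // TR)  # which volume was being acquired at onset
    if volume < n_volumes:
        words_per_volume[volume].append(word)

print(words_per_volume)  # [['above', 'the', 'city'], ['we', 'drifted'], []]
```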


The researchers then fed that information into a word-embedding algorithm that scored words according to how closely they are related semantically.
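
As a toy illustration of that kind of scoring: represent each word as a vector and measure relatedness with cosine similarity. The three-dimensional vectors below are made up and far smaller than any real embedding model.

```python
import numpy as np

# Invented vectors; real embeddings have hundreds of dimensions.
embeddings = {
    "mother": np.array([0.90, 0.80, 0.10]),
    "father": np.array([0.85, 0.75, 0.15]),
    "week":   np.array([0.10, 0.20, 0.95]),
}

def relatedness(a, b):
    """Cosine similarity between two word vectors (1.0 = same direction)."""
    va, vb = embeddings[a], embeddings[b]
    return float(va @ vb / (np.linalg.norm(va) * np.linalg.norm(vb)))

print(relatedness("mother", "father"))  # high: close in meaning
print(relatedness("mother", "week"))    # low: semantically distant
```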


Charting language across the brain

The results were converted into a thesaurus-like map that arranged words on images of the flattened cortices of the left and right hemispheres of the brain. Words were grouped under various headings: visual, tactile, numeric, locational, abstract, temporal, professional, violent, communal, mental, emotional and social.
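
A rough analogue of that grouping step, assuming nothing about the study's actual procedure: cluster toy word vectors into a few categories with k-means (scikit-learn is used here only for brevity).

```python
import numpy as np
from sklearn.cluster import KMeans

words = ["mother", "father", "week", "month", "red", "green"]
vecs = np.array([
    [0.90, 0.10, 0.00], [0.85, 0.15, 0.05],  # social-leaning vectors
    [0.10, 0.90, 0.10], [0.05, 0.95, 0.00],  # temporal-leaning vectors
    [0.00, 0.10, 0.90], [0.10, 0.00, 0.85],  # visual-leaning vectors
])

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(vecs)
for word, label in zip(words, labels):
    print(f"{word} -> group {label}")
```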


Not surprisingly, the maps show that many areas of the human brain represent language that describes people and social relations, rather than abstract concepts.




"Our semantic models are good at predicting responses to language in several big swaths of cortex," Huth said. "But we also get the fine-grained information that tells us what kind of information is represented in each brain area. That's why these maps are so exciting and hold so much potential."


Senior author Jack Gallant, a UC Berkeley neuroscientist, said that although the maps are broadly consistent across individuals, "There are also substantial individual differences. We will need to conduct further studies across a larger, more diverse sample of people before we will be able to map these individual differences in detail."


Note: Material may have been edited for length and content. For further information, please contact the cited source.

National Science Foundation press release


Publication

Huth AG et al. Natural speech reveals the semantic maps that tile human cerebral cortex. Nature, published online April 27, 2016. doi: 10.1038/nature17637