Massage Stones Help Reveal How the Prefrontal Cortex Integrates Our Senses
An image of a beautiful beach conjures up certain sensations – one can imagine the warmth of the sun as it caresses the skin, and the sound of the water as waves break on the shore. But how is it that the human brain produces these impressions even when an individual isn’t actually standing on a beach, basking in the sun’s rays and listening to the sound of the waves?
Scientists at the University of Toronto (U of T) exploring this mystery found that the brain’s prefrontal cortex – a region known primarily for its role in regulating behaviour, impulse inhibition, and cognitive flexibility – produces such general sensations based on information provided by various senses. The findings provide new insights into the poorly understood role of the prefrontal cortex in human perception.
Using a combination of photographs, sounds and even heated massage stones, the researchers compared patterns of neural activity in the prefrontal cortex with those in the brain regions known to process stimulation from each of the senses, and found significant similarities.
“Whether an individual was directly exposed to warmth, for example, or simply looking at a picture of a sunny scene, we saw the same pattern of neural activity in the prefrontal cortex,” said Dirk Bernhardt-Walther, professor in the Department of Psychology in the Faculty of Arts & Science at the U of T, and coauthor of a study published in the Journal of Neuroscience describing the findings. “The results suggest that the prefrontal cortex generalizes perceptual experiences that originate from different senses.”
To understand how the human brain processes the torrent of information from the environment, researchers often study the senses in isolation, with much of prior work focused on the visual system. Bernhardt-Walther says that while such work is illuminating and important, it is equally important to find out how the brain integrates information from the different senses, and how it uses the information in a task-directed manner. “Understanding the basics of these capabilities provides the foundation for research of disorders of perception,” he said.
Using functional magnetic resonance imaging (fMRI) to capture brain activity, the researchers conducted two experiments with the same participants, drawing on the knowledge that regions of the brain respond differently depending on the intensity of stimulation.
In the first, the participants viewed a series of images of various scenes – including beaches, city streets, forests, and train stations – and were asked to judge if the scenes were warm or cold and noisy or quiet. Throughout, neural activity across several regions of the brain was tracked.
In the second experiment, participants were first handed a series of massage stones that were either heated to 45℃ or cooled to 9℃, and later exposed to sounds both quiet and noisy – such as birds, people, and waves at a beach.
“When we compared the patterns of activity in the prefrontal cortex, we could determine temperature both from the stone experiment and from the experiment with pictures, as the neural activity patterns for temperature were so consistent between the two experiments,” said lead author of the study Yaelan Jung, who recently completed her PhD at U of T working with Bernhardt-Walther and is now a postdoctoral researcher at Emory University.
“We could successfully determine whether a participant was holding a warm or a cold stone from patterns of brain activity in the somatosensory cortex, which is the part of the brain that receives and processes sensory information from the entire body, while brain activity in the visual cortex told us if they were looking at an image of a warm or cold scene,” said Jung.
The patterns were so compatible that a decoder trained on prefrontal brain activity from the stone experiment was able to predict the temperature of a scene depicted in an image as it was viewed.
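The cross-decoding logic described here can be illustrated with a minimal sketch. This is not the study's actual analysis pipeline or data; it uses hypothetical synthetic "voxel" patterns and a simple nearest-centroid classifier to show the general idea: fit a decoder on activity from one modality (holding warm vs. cold stones) and test it on activity from another (viewing warm vs. cold scenes).

```python
# Hypothetical illustration of cross-modal decoding -- synthetic data,
# not the study's fMRI recordings.
import numpy as np

rng = np.random.default_rng(0)
n_voxels, n_trials = 50, 40

# Assume each temperature evokes a shared prefrontal pattern regardless of
# modality (the study's key finding), plus trial-by-trial noise.
warm_pattern = rng.normal(size=n_voxels)
cold_pattern = rng.normal(size=n_voxels)

def simulate(pattern, noise=1.0):
    return pattern + noise * rng.normal(size=(n_trials, n_voxels))

# "Training" data: holding heated vs. cooled massage stones.
stone_warm, stone_cold = simulate(warm_pattern), simulate(cold_pattern)
# "Test" data: viewing pictures of warm vs. cold scenes.
pic_warm, pic_cold = simulate(warm_pattern), simulate(cold_pattern)

# Nearest-centroid decoder fit on the stone experiment only.
centroids = np.stack([stone_warm.mean(0), stone_cold.mean(0)])  # 0=warm, 1=cold

def decode(trials):
    dists = np.linalg.norm(trials[:, None, :] - centroids[None], axis=2)
    return dists.argmin(axis=1)

acc = np.mean(np.concatenate([decode(pic_warm) == 0, decode(pic_cold) == 1]))
print(f"cross-modal decoding accuracy: {acc:.2f}")  # well above chance (0.5)
```

If the two modalities did not share a common pattern (i.e., pictures evoked unrelated activity), the decoder would fall to chance, which is the contrast that makes the cross-decoding result informative.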
“It tells us about the relationship between someone feeling warmth by looking at a picture versus actually touching a warm object,” Jung said.
Similarly, the researchers could decode noisy versus quiet sounds from the brain’s auditory cortex and pictures of noisy versus quiet scenes from the visual cortex.
“Overall, the neural activity patterns in the prefrontal cortex produced by participants viewing the images were the same as those triggered by actual experience of temperature and noise level,” said Jung.
The researchers suggest the findings may open a new avenue to study how the brain manages to process and represent complex real-world attributes that span multiple senses, even without directly experiencing them.
“In understanding how the human brain integrates information from different senses into higher-level concepts, we may be able to pinpoint the causes of specific inabilities to recognize particular kinds of objects or concepts,” said Bernhardt-Walther.
“Our results might help people with limitations in one sensory modality to compensate with another and reach the same or very similar conceptual representations in their prefrontal cortex, which is essential for making decisions about their environment.”
Jung Y, Walther DB. Neural Representations in the Prefrontal Cortex Are Task Dependent for Scene Attributes But Not for Scene Categories. J Neurosci. 2021;41(34):7234-7245. doi:10.1523/JNEUROSCI.2816-20.2021
This article has been republished from the following materials. Note: material may have been edited for length and content. For further information, please contact the cited source.