What We Hear Affects Our Sense of Touch
News


Researchers at the University of East Anglia have made an important discovery about the way our brains process the sensations of sound and touch.


A new study published today shows how the brain’s different sensory systems are all closely interconnected – with regions that respond to touch also involved when we listen to specific sounds associated with touching objects.


They found that these areas of the brain can tell the difference between sounds such as a ball bouncing and typing on a keyboard.


It is hoped that understanding this key area of brain function may in future help people who are neurodiverse or who have conditions such as schizophrenia or anxiety. It could also lead to developments in brain-inspired computing and AI.


Lead researcher Dr Fraser Smith, from UEA’s School of Psychology, said: “We know that when we hear a familiar sound such as a bouncing ball, this leads us to expect to see a particular object. But what we have found is that it also leads the brain to represent what it might feel like to touch and interact with that object.


“These expectations can help the brain process sensory information more efficiently.”


The research team used an MRI scanner to collect brain imaging data while 10 participants listened to sounds generated by interacting with objects - such as bouncing a ball, knocking on a door, crushing paper, or typing on a keyboard.


Using functional MRI (fMRI), they measured activity throughout the brain.


They used machine learning analysis techniques to test whether the activity generated in the earliest touch areas of the brain (primary somatosensory cortex) could tell apart sounds generated by different types of object interaction (bouncing a ball versus typing on a keyboard).


They also performed a similar analysis on control sounds, similar to those used in hearing tests, to rule out the possibility that any sound at all could be discriminated in this brain region.


Researcher Dr Kerri Bailey said: “Our research shows that parts of our brains, which were thought to only respond when we touch objects, are also involved when we listen to specific sounds associated with touching objects.


“This supports the idea that a key role of these brain areas is to predict what we might experience next, from whatever sensory stream is currently available.”


Dr Smith added: “Our findings challenge how neuroscientists traditionally understand the workings of sensory brain areas and demonstrate that the brain’s different sensory systems are actually all very interconnected.


“Our assumption is that the sounds provide predictions to help our future interaction with objects, in line with a key theory of brain function – called Predictive Processing.


“Understanding this key mechanism of brain function may provide compelling insights into mental health conditions such as schizophrenia, autism or anxiety and in addition, lead to developments in brain-inspired computing and AI.”


Reference: Bailey KM, Giordano BL, Kaas AL, Smith FW. Decoding sounds depicting hand-object interactions in primary somatosensory cortex. bioRxiv. Published online January 1, 2022:732669. doi:10.1101/732669


This article has been republished from the following materials. Note: material may have been edited for length and content. For further information, please contact the cited source.
