How Does the Brain Represent the Objects We Touch?

Article | Aug 01, 2018

by Rongala Udaya, The BioRobotics Institute, Scuola Superiore Sant'Anna, Pisa, Italy

During infancy we perform an enormous amount of motor babbling (random movements of body parts), which provides the brain with information from different sensory modalities (touch, muscle spindles, vision) from across the body. The brain integrates this information to form an overall sense of oneself and of the surrounding world. With our research, we aimed to understand how the brain represents touch information. Understanding how the brain represents touch offers better insight into the bigger question, “how does the brain work?”, and also enables the development of better neuroprostheses and of artificial intelligence for robotics.

To realize this objective, we took an integrated approach, combining engineering, neurophysiology and neurocomputational modelling to create a functional artificial tactile system.

First, we used an artificial fingertip sensor developed in-house to mimic the properties of the tactile afferents (sensors) present in the human hand.
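To make this step concrete, here is a minimal sketch of how raw fingertip sensor readings might be encoded as afferent-like spike trains. The function name, the Poisson rate model and all parameters are illustrative assumptions, not the transduction model used in the study.

```python
# Hypothetical sketch: turning fingertip pressure readings into
# afferent-like spike trains via a simple rate model (not the
# authors' actual transduction model).
import numpy as np

def encode_afferents(pressure, dt=0.001, gain=50.0, baseline=2.0, rng=None):
    """Map a (time, n_sensors) pressure array to Poisson spike trains
    whose rates grow with pressure, mimicking tactile afferents."""
    rng = np.random.default_rng() if rng is None else rng
    rates = baseline + gain * np.clip(pressure, 0.0, None)  # spikes/s
    return rng.random(pressure.shape) < rates * dt          # boolean spikes

# Example: 1 s of noisy pressure on 4 taxels, sampled at 1 kHz
pressure = np.abs(np.random.default_rng(0).normal(0.2, 0.1, (1000, 4)))
spikes = encode_afferents(pressure)
print(spikes.mean(axis=0) / 0.001, "(mean firing rates, Hz)")
```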

Second, we developed a model of cuneate nucleus neurons based on previous neurophysiological studies. These neurons provide the first central processing stage for tactile information before it is relayed onward to the rest of the central nervous system. We also developed a synaptic learning rule for these neurons, based on existing hypotheses and assumptions.
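As a rough illustration of what such a neuron and learning rule might look like, the sketch below models a cuneate-like neuron as a leaky integrator and uses Oja's rule, a classic self-stabilizing Hebbian rule, as a stand-in for the intracellular-dynamics-based rule described in the paper; the class name and parameters are assumptions for illustration.

```python
# Illustrative sketch only: a leaky-integrator "cuneate" neuron with a
# self-stabilizing Hebbian update (Oja's rule), standing in for the
# intracellular-dynamics rule of the actual study.
import numpy as np

class CuneateNeuron:
    def __init__(self, n_inputs, tau=0.02, lr=0.01, rng=None):
        rng = np.random.default_rng() if rng is None else rng
        self.w = rng.uniform(0.0, 0.1, n_inputs)  # synaptic weights
        self.v = 0.0                              # membrane-like state
        self.tau, self.lr = tau, lr

    def step(self, x, dt=0.001):
        # Leaky integration of the weighted afferent input
        self.v += dt / self.tau * (self.w @ x - self.v)
        y = max(self.v, 0.0)                      # rectified output rate
        # Oja's rule: the y**2 * w decay term bounds the weights,
        # which is what makes plain Hebbian learning self-stabilizing
        self.w += self.lr * y * (x - y * self.w)
        return y
```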

Third, we modelled a biologically inspired neural network architecture joining the artificial fingertip with the cuneate neuron and synaptic learning models.

Fourth, we ran simulations with inputs from random textures and shapes, and let the model learn according to a set of ground rules.
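Steps three and four can be pictured as a toy loop that wires the spike-encoded fingertip input into a small population of model neurons and lets the synaptic weights adapt across random trials. It reuses the hypothetical encode_afferents() and CuneateNeuron sketches above, with arbitrary sizes and parameters.

```python
# Toy end-to-end loop over random "texture" trials, reusing the
# hypothetical sketches above; all sizes and parameters are arbitrary.
import numpy as np

rng = np.random.default_rng(1)
neurons = [CuneateNeuron(n_inputs=4, rng=rng) for _ in range(8)]

for trial in range(20):                            # random texture trials
    pressure = np.abs(rng.normal(0.2, 0.1, (1000, 4)))
    spikes = encode_afferents(pressure, rng=rng)
    for x in spikes.astype(float):                 # one 1 ms step at a time
        outputs = [neuron.step(x) for neuron in neurons]

print("weights of neuron 0 after learning:", neurons[0].w)
```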

Through this functional integration, the artificial tactile system learned the correlations among the artificial fingertip's sensors and was able to identify both previously encountered and novel tactile sensory experiences.


From a neuroscience perspective, we were able to investigate what forms of representation the brain could learn automatically from tactile interactions with the world. From an engineering perspective, we were able to create a robust and dynamic learning algorithm based on random tactile experiences.

Reference:
Rongala, U. B., Spanne, A., Mazzoni, A., Bengtsson, F., Oddo, C., & Jörntell, H. (2018). Intracellular dynamics in cuneate nucleus neurons support self-stabilizing learning of generalizable tactile representations. Frontiers in Cellular Neuroscience, 12, 210.

Acknowledgments: This work was a collective effort of all the authors from Scuola Superiore Sant’Anna, Pisa, Italy, and Lund University, Sweden. This work was supported by the Ministry of Education of the Italian Republic, the Swedish Research Council and the EU FET Grant.
