How Does the Brain Represent the Objects We Touch?


During infancy we perform an enormous amount of motor babbling (random movements of body parts), which supplies the brain with information from different sensory modalities (touch, proprioception from muscle spindles, vision) from across the body. The brain integrates this information to form an overall sense of oneself and of the surrounding world. With our research, we set out to understand how the brain represents touch information. Understanding this offers better insight into the bigger question, "how does the brain work?", and also enables the development of better neuroprostheses and of artificial intelligence for robotics.

To realize this objective, we took an integrated approach, combining engineering, neurophysiology and neurocomputational modelling to create a functional artificial tactile system.

First, we used an artificial fingertip sensor, developed in-house, to mimic the properties of the tactile afferents (sensors) present in the human hand.
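To give a flavour of what this means in code, here is a minimal sketch of how a sensor reading can be mapped to afferent-like firing rates. It is an illustration only, not our actual sensor pipeline; the function name, gains and thresholds are all assumptions made for the example.

```python
import numpy as np

def force_to_spike_rates(force, n_afferents=4, gains=None, thresholds=None):
    """Map a fingertip force reading (newtons) to firing rates (Hz) of
    model tactile afferents. Toy illustration: each afferent gets its
    own gain and threshold, loosely imitating the varied sensitivity
    of human mechanoreceptors."""
    if gains is None:
        gains = np.linspace(50.0, 200.0, n_afferents)     # Hz per newton (assumed)
    if thresholds is None:
        thresholds = np.linspace(0.01, 0.1, n_afferents)  # newtons (assumed)
    return gains * np.maximum(force - thresholds, 0.0)

# Example: a 0.5 N press drives each model afferent at a different rate
print(force_to_spike_rates(0.5))
```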

Second, we developed a model of cuneate nucleus neurons, based on previous neurophysiological studies. These neurons provide the first central processing stage for tactile information before it is relayed to higher brain areas. We also developed a synaptic learning rule for these neurons, based on existing hypotheses and assumptions.
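For readers who like concrete examples, the sketch below shows the general shape of such a model: a leaky integrator neuron whose input weights adapt under an Oja-style self-stabilising Hebbian rule. This is a deliberately simplified stand-in, not our published model, which is built on the neurons' intracellular dynamics (see the reference below); all parameters here are assumed.

```python
import numpy as np

class CuneateNeuron:
    """Toy rate-based stand-in for a cuneate nucleus neuron (not the published model)."""

    def __init__(self, n_inputs, lr=0.01, tau=20.0, seed=0):
        rng = np.random.default_rng(seed)
        self.w = rng.uniform(0.0, 1.0, n_inputs)  # synaptic weights
        self.v = 0.0                              # membrane-like state
        self.lr = lr
        self.tau = tau

    def step(self, rates, dt=1.0):
        """Leaky integration of the weighted afferent input."""
        self.v += dt / self.tau * (self.w @ rates - self.v)
        return self.v

    def learn(self, rates):
        """Oja-style Hebbian update: the decay term keeps the weights
        bounded, giving self-stabilising behaviour without any explicit
        normalisation step."""
        self.w += self.lr * (self.v * rates - self.v ** 2 * self.w)
        np.clip(self.w, 0.0, None, out=self.w)

# Example: repeated exposure to one input pattern (rates normalised so the
# toy update stays stable) pulls the weights towards that pattern's direction.
neuron = CuneateNeuron(n_inputs=4)
pattern = np.array([0.1, 0.8, 0.05, 0.5])
for _ in range(1000):
    neuron.step(pattern)
    neuron.learn(pattern)
print(np.round(neuron.w, 3))
```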

Third, we modelled a biologically inspired neural network architecture that joins the artificial fingertip with the cuneate neuron and synaptic learning models (a toy end-to-end assembly of these pieces is sketched after the fourth step below).

Fourth, we ran simulations with inputs from random textures and shapes, and let the model learn according to a small set of ground rules.
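Putting the pieces together, here is a toy end-to-end version of steps three and four: random "textures" drive a bank of model afferents, a population of simplified cuneate units responds, and the same Oja-style self-stabilising Hebbian rule shapes the connections. All sizes, rates and the learning rule itself are assumptions made for illustration, not the published architecture.

```python
import numpy as np

rng = np.random.default_rng(42)
n_afferents, n_cuneate, n_textures, n_epochs = 16, 8, 5, 200

# Hypothetical "textures": each drives the model afferents with a fixed
# pattern of normalised firing rates (stand-in for the fingertip's output).
textures = rng.uniform(0.0, 1.0, (n_textures, n_afferents))

# Random initial afferent-to-cuneate weights (toy architecture)
W = rng.uniform(0.0, 1.0, (n_cuneate, n_afferents))

lr = 0.01
for epoch in range(n_epochs):
    for t in rng.permutation(n_textures):   # present textures in random order
        x = textures[t]
        y = W @ x                           # cuneate responses (linear stand-in)
        # Oja-style self-stabilising Hebbian update, one row per cuneate unit
        W += lr * (np.outer(y, x) - (y ** 2)[:, None] * W)

# After learning, each texture maps to a reproducible response pattern, so
# familiar inputs land at consistent points in response space and novel
# inputs land elsewhere.
print(np.round(textures @ W.T, 2))
```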

Through this functional integration, the artificial tactile system learned the correlations in the artificial fingertip's sensor signals and was able to identify both familiar and novel tactile sensory experiences.


From a neuroscience perspective, we were able to investigate what form of representations the brain could learn automatically from tactile interactions with the world. From an engineering perspective, we were able to create a robust, adaptive learning algorithm driven by random tactile experiences.

Reference:
Rongala, U. B., Spanne, A., Mazzoni, A., Bengtsson, F., Oddo, C., & Jörntell, H. (2018). Intracellular dynamics in cuneate nucleus neurons support self-stabilizing learning of generalizable tactile representations. Frontiers in Cellular Neuroscience, 12, 210.

Acknowledgments: This work was a collective effort of all the authors from Scuola Superiore Sant’Anna, Pisa, Italy, and Lund University, Sweden. It was supported by the Ministry of Education of the Italian Republic, the Swedish Research Council and the EU FET Grant.