

Paying attention makes touch-sensing brain cells fire rapidly and in sync


New research brings us a step closer to cracking the code of how brains work

Whether we’re paying attention to something we see can be discerned by monitoring the firings of specific groups of brain cells. Now, new work from Johns Hopkins shows that the same holds true for the sense of touch. The study brings researchers closer to understanding how animals’ thoughts and feelings affect their perception of external stimuli.

The results have been published in the journal PLoS Biology.

“There is so much information available in the world that we cannot process it all,” says Ernst Niebur, Ph.D., a professor of neuroscience in the Johns Hopkins University School of Medicine. “Many researchers believe the brain copes with this by immediately throwing away most of what we take in — that’s called selective attention. But we need to be certain that what is thrown away is really the irrelevant part. We investigated how our neurons do that.”

Niebur, a computational biologist, worked with Steven Hsiao, Ph.D., a professor of neuroscience in Johns Hopkins’ Zanvyl Krieger Mind/Brain Institute, who died in June, on the study. Hsiao’s assistant research scientist, Manuel Gomez-Ramirez, Ph.D., trained three rhesus monkeys to pay attention to either the orientation (vertical or horizontal) or the vibration rate (fast or slow) of a pencil-shaped object using their sense of touch. The monkeys learned to move their gaze to a location on a monitor screen corresponding to the right answer and were rewarded with drops of juice or water.

Gomez-Ramirez then monitored the activity of groups of neurons and figured out which were responsible for perceiving each property. When the monkeys were paying attention to the object’s orientation, he found, the neurons for that property fired more rapidly, and more synchronously, than did the neurons for the vibration rate. That much was consistent with previous studies on selective attention in vision.

In addition, the research team found that the firing rate of the neurons for the attended property, and how much they synced up, predicted how well the monkey performed on the task, that is, whether it shifted its gaze to the correct location on the monitor. Synchronization, however, was more important to performance than firing rate.


The results are a step toward “cracking the neural code,” Niebur says, an ambitious goal for which his research group continues to strive. “We’re looking for the neural code of internal thought processes,” he says. “It’s a very fundamental question.”


Note: Material may have been edited for length and content. For further information, please contact the cited source.

Johns Hopkins Medicine press release


Publication

Manuel Gomez-Ramirez, Natalie K. Trzcinski, Stefan Mihalas, Ernst Niebur, Steven S. Hsiao. Temporal Correlation Mechanisms and Their Role in Feature Selection: A Single-Unit Study in Primate Somatosensory Cortex. PLoS Biology, published November 25, 2014. doi: 10.1371/journal.pbio.1002004