Learning impacts how the brain processes what we see

From the smell of flowers to the taste of wine, our perception is strongly influenced by prior knowledge and expectations, a cognitive process known as top-down control.


In a University of California, San Diego School of Medicine study published in the journal Nature Neuroscience, a research team led by Takaki Komiyama, PhD, assistant professor of neurosciences and neurobiology, reports that in mouse models, learning substantially changed the operating mode of the visual cortex by strengthening top-down processing.


“We found that when the mouse assigns a new meaning to a previously neutral visual stimulus, top-down control becomes much more influential in activating the visual cortex,” said first author Hiroshi Makino, PhD, postdoctoral researcher in Komiyama’s lab. “Top-down inputs interact with specific neuron types in the visual cortex to modulate its operation modes.”


In top-down control, our thoughts and expectations shape how we interpret what our senses take in. For example, when we see a word with missing letters, the brain fills in the blanks based on past experience.


During associative learning, the researchers recorded activity in excitatory neurons and somatostatin-expressing inhibitory neurons in the visual cortex, along with top-down inputs from the retrosplenial cortex (RSC), to see how these circuit elements shifted the balance between top-down processing and bottom-up processing, in which perception begins with the senses.


The findings indicate that intricate interactions of various circuit components effectively change the balance of top-down and bottom-up processing, with learning enhancing the contribution of top-down control. These results support the long-standing theory that the brain does not faithfully represent the environment but rather attempts to predict it based upon prior information.


“In addition to revealing circuit mechanisms underlying these learning-related changes, our findings may have implications in understanding the pathophysiology of psychiatric diseases, such as schizophrenia, that generate abnormal perception,” said Makino.


Note: Material may have been edited for length and content. For further information, please contact the cited source.

University of California, San Diego. Original reporting by Yadira Galindo.


Publication

Hiroshi Makino & Takaki Komiyama. Learning enhances the relative impact of top-down processing in the visual cortex. Nature Neuroscience, published online July 13, 2015. doi: 10.1038/nn.4061