Researchers Discover Color-Specific Brain Activity Patterns
Researchers at the National Eye Institute (NEI) have decoded brain maps of human color perception. The findings, published today in Current Biology, open a window into how color processing is organized in the brain, and how the brain recognizes and groups colors in the environment. The study may have implications for the development of machine-brain interfaces for visual prosthetics. NEI is part of the National Institutes of Health.
“This is one of the first studies to determine what color a person is seeing based on direct measurements of brain activity,” said Bevil Conway, Ph.D., chief of NEI’s Unit on Sensation, Cognition and Action, who led the study. “The approach lets us get at fundamental questions of how we perceive, categorize, and understand color.”
The brain uses light signals detected by the retina’s cone photoreceptors as the building blocks for color perception. Three types of cone photoreceptors detect light over a range of wavelengths. The brain mixes and categorizes these signals to perceive color in a process that is not well understood.
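To make the idea of cone signals as “building blocks” concrete, the toy sketch below computes how three cone types might respond to a single light. The Gaussian sensitivity curves, peak wavelengths, and example spectrum are rough illustrative assumptions, not the actual cone fundamentals or stimuli used in the study.

```python
# Toy illustration of how three cone types sample a light spectrum.
# Cone sensitivities are modeled as Gaussians purely for illustration;
# real cone fundamentals have different, asymmetric shapes.
import numpy as np

wavelengths = np.arange(400, 701)  # visible range, in nm

def gaussian_sensitivity(peak_nm, width_nm=40):
    return np.exp(-0.5 * ((wavelengths - peak_nm) / width_nm) ** 2)

# Approximate peak sensitivities of the S, M, and L cones (hypothetical values, nm).
cones = {
    "S": gaussian_sensitivity(445),
    "M": gaussian_sensitivity(535),
    "L": gaussian_sensitivity(565),
}

# A hypothetical warm (orange-ish) light with energy concentrated near 600 nm.
spectrum = np.exp(-0.5 * ((wavelengths - 600) / 25) ** 2)

# Each cone's signal is the light's spectrum weighted by that cone's sensitivity.
responses = {name: float((sens * spectrum).sum()) for name, sens in cones.items()}
print(responses)  # L > M >> S for this long-wavelength light
```

The brain’s task, as described above, is to combine and categorize signals like these three numbers into the colors we perceive.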
To examine this process, Isabelle Rosenthal, Katherine Hermann, and Shridhar Singh, post-baccalaureate fellows in Conway’s lab and co-first authors on the study, used magnetoencephalography, or “MEG,” a 50-year-old technology that noninvasively records the tiny magnetic fields that accompany brain activity. Using an array of sensors around the head, the technique directly measures brain cell activity, revealing the millisecond-by-millisecond changes that enable vision. The researchers recorded these patterns of activity as volunteers viewed specially designed color images and reported the colors they saw.
The researchers worked with pink, blue, green, and orange hues so that they could activate the different classes of photoreceptors in similar ways. These colors were presented at two luminance levels – light and dark. The researchers used a spiral stimulus shape, which produces a strong brain response.
The researchers found that study participants had unique patterns of brain activity for each color. With enough data, the researchers could predict from MEG recordings what color a volunteer was looking at – essentially decoding the brain map of color processing, or “mind-reading.”
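The sketch below illustrates the general logic of this kind of decoding, not the authors’ actual analysis pipeline: train a classifier on sensor-level response patterns and test whether it predicts the presented color better than chance. The array shapes, sensor and trial counts, and the use of scikit-learn logistic regression are assumptions made for illustration; the data here are random placeholders.

```python
# Minimal sketch of MEG color decoding (illustrative only; not the study's pipeline).
# Assumes `meg_patterns` is an (n_trials, n_sensors) array of evoked responses and
# `color_labels` codes which of the eight stimuli (4 hues x 2 luminances) was shown.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_trials, n_sensors = 400, 272                     # hypothetical counts
meg_patterns = rng.normal(size=(n_trials, n_sensors))   # placeholder data
color_labels = rng.integers(0, 8, size=n_trials)        # 4 hues x 2 luminance levels

# Standardize each sensor, then fit a multinomial classifier; cross-validated
# accuracy above chance (1/8) would indicate color-specific activity patterns.
decoder = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(decoder, meg_patterns, color_labels, cv=5)
print(f"Mean decoding accuracy: {scores.mean():.2f} (chance = {1/8:.2f})")
```

With real recordings in place of the random placeholders, above-chance accuracy is what makes the “mind-reading” claim possible.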
“The point of the exercise wasn’t merely to read the minds of volunteers,” Conway said. “People have been wondering about the organization of colors for thousands of years. The physical basis for color—the rainbow—is a continuous gradient of hues. But people don’t see it that way. They carve the rainbow into categories and arrange the colors as a wheel. We were interested in understanding how the brain makes this happen, how hue interacts with brightness, such as to turn yellow into brown.”
As an example, in a variety of languages and cultures, humans have more distinct names for warm colors (yellows, reds, oranges, browns) than for cool colors (blues, greens). It has long been known that people consistently use a wider variety of names for warm hues at different luminance levels (e.g., “yellow” versus “brown”) than for cool hues (e.g., “blue” is used for both light and dark). The new discovery shows that brain activity patterns vary more between light and dark warm hues than between light and dark cool hues. The findings suggest that our universal propensity to have more names for warm hues may actually be rooted in how the human brain processes color, not in language or culture.
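One simple way to quantify that kind of comparison, offered here purely as an illustration rather than as the measure used in the study, is to ask how dissimilar the average light and dark response patterns are for each hue. The correlation-distance metric and the placeholder random data below are assumptions.

```python
# Illustrative comparison of how much MEG response patterns differ between
# light and dark versions of warm vs. cool hues. The metric and placeholder
# data are assumptions for illustration, not the study's method.
import numpy as np

def light_dark_distance(light_trials, dark_trials):
    """1 - Pearson correlation between the mean light and mean dark patterns."""
    return 1.0 - np.corrcoef(light_trials.mean(axis=0), dark_trials.mean(axis=0))[0, 1]

rng = np.random.default_rng(1)
n_trials, n_sensors = 50, 272  # hypothetical trial and sensor counts
hues = ["pink", "orange", "blue", "green"]  # warm: pink, orange; cool: blue, green
patterns = {(hue, lum): rng.normal(size=(n_trials, n_sensors))
            for hue in hues for lum in ("light", "dark")}

for hue in hues:
    d = light_dark_distance(patterns[(hue, "light")], patterns[(hue, "dark")])
    print(f"{hue:>6}: light-vs-dark pattern distance = {d:.3f}")
# The reported finding corresponds to warm hues showing larger distances than cool hues.
```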
“For us, color is a powerful model system that reveals clues to how the mind and brain work. How does the brain organize and categorize color? What makes us think one color is more similar to another?” said Conway. “Using this new approach, we can use the brain to decode how color perception works – and in the process, hopefully uncover how the brain turns sense data into perceptions, thoughts, and ultimately actions.”
The study was funded by the NEI Intramural Program.