A Live Broadcast of the Brain
Red and blue lights flash. A machine whirs like a distant swarm of bees. In a cubicle-sized room, Yoav Adam, a microscope, and a video projector capture something no one has ever seen before: neurons flashing in real time, in a walking, living creature.
For decades scientists have been searching for a way to watch a live broadcast of the brain. Neurons send and receive massive amounts of information--Toe itches! Fire hot! Garbage smells!--with impressive speed. Electrical signals can travel from cell to cell at up to 270 miles per hour.
But, neural electricity is just as hard to see as electricity in a telephone wire: To the unassisted eye, the busy brain looks as lifeless as rubber. So, to observe how neurons turn information (toe itches) into thoughts ("itching powder"), behaviors (scratching), and emotions (anger), we need to change the way we see.
A new study, published in Nature, does just that.
Adam Cohen, Professor of Chemistry and Chemical Biology and of Physics at Harvard, first author Dr. Yoav Adam, and their cross-disciplinary research team shed literal light on the brain, transforming neural electrical signals into sparks visible through a microscope.
Tricking nature
In the 1980s, during an ecological survey of the Dead Sea, an Israeli ecologist found an organism that performs a neat trick: converting sunlight into electrical energy in a primitive form of photosynthesis. But, for almost 30 years, the organism and its talented protein (Archaerhodopsin 3) floated undisturbed in the waters of the Dead Sea.
Then, in 2010, researchers at the Massachusetts Institute of Technology (MIT) dusted the protein off, introduced it to a brain, and got the tiny tool to perform its light trick in neurons. When they trained light on the protein-enhanced brain, the tool converted the light into electricity. The researchers could then change the firing of the neurons and, if they chose well, even manipulate the animal's behavior.
Cohen was intrigued. He wondered: Could we reverse the trick? Could the protein convert the electrical activity of neurons into detectable flashes of light? After a few years of hard work, he discovered his answer: Yes. It can.
An electrical spike shoots through a single neuron in milliseconds. Neural signals can travel from cell to cell at speeds up to 270 miles per hour, but the Cohen Lab can catch them. Credit: Cohen Lab, Harvard University
Honing the tools
When illuminated with red light, Archaerhodopsin can turn voltage into light (this and similar tools are known as genetically encoded voltage indicators, or GEVIs). The protein acts like an ultra-sensitive voltmeter that, like the hair on your arm, changes with an electric jolt.
The Cohen lab pairs this protein with a similar one that, when illuminated with blue light, sparks electrical impulses in the neurons. "This way," Yoav says, "we can both control the activity of the cells and record activity of the cells." Blue light controls; red light records.
The paired proteins worked well in neurons outside the brain, in a dish. "But," Cohen says, "the holy grail was to get this to work in live mice that are actually doing something."
The elusive "holy grail" came after five years of intense, interdisciplinary collaboration between 24 neuroscientists, molecular biologists, biochemists, physicists, computer scientists, and statisticians. First, they tweaked the protein to work in live animals; then, with some adept genetic manipulation, they positioned the protein in the right part of the right cells in the mouse brain. Finally, they built a new microscope, customized with a video projector to shine a pattern of red and blue light into the live mouse's brain, and onto specific cells of interest.
"You basically make a little movie," Cohen says.
With red and blue light patterned on the brain, Yoav can control when and which neurons fire and capture their electric activity as light. To identify individual neural signals in the bright chaos, the team designed one final new tool: software that could extract specific neural sparks, like individual fireflies from a swarm.
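To give a rough sense of what such spike extraction involves, here is a minimal sketch in Python: treat one neuron's fluorescence over time as a 1-D trace, estimate its slow baseline, and flag brief upward deviations as candidate spikes. This is an illustrative toy, not the team's published pipeline; the function name, window sizes, thresholds, and synthetic data are all assumptions.

```python
# Minimal, illustrative sketch of spike extraction from a fluorescence trace.
# NOT the Cohen lab's published pipeline; parameters and data are assumed.
import numpy as np

def detect_spikes(trace, frame_rate_hz=1000.0, threshold_sd=4.0):
    """Return frame indices where the trace rises sharply above its baseline."""
    # Estimate a slow baseline with a wide moving average (~50 ms, assumed).
    window = int(frame_rate_hz * 0.05)
    kernel = np.ones(window) / window
    baseline = np.convolve(trace, kernel, mode="same")

    # Deviation from baseline, scaled by a robust estimate of the noise level.
    residual = trace - baseline
    noise_sd = np.median(np.abs(residual)) / 0.6745
    above = residual > threshold_sd * noise_sd

    # Keep only the first frame of each threshold crossing (spike onset).
    onsets = np.flatnonzero(above & ~np.roll(above, 1))
    return onsets

# Synthetic example: a noisy trace with three brief "spikes" added.
rng = np.random.default_rng(0)
trace = rng.normal(0.0, 0.01, size=2000)
for t in (400, 900, 1500):
    trace[t:t + 3] += 0.2

print(detect_spikes(trace))  # -> frame indices near 400, 900, 1500
```

The real challenge the team faced is harder than this toy suggests: many overlapping cells share each pixel, so the signals must first be unmixed before anything like the thresholding above can be applied.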
Clarity from chaos
But neural signals travel far faster than fireflies. In a third of the time it takes for you to blink, the Cohen team can capture precise, intimate details of a neuron's spike pattern--like the shifting wing positions of a firefly in flight. They can record up to ten neurons at a time, a feat impossible with other existing technologies, and, three weeks later, find the same exact neurons to record anew.
Yoav is not the first to record neural signals: Hair-thin glass tubes, inserted into brain tissue, can get the job done. But, such devices record only one or two neurons at a time and, like a splinter, must be removed before causing damage. Other tools monitor calcium, which floods neurons when they fire. But, according to Cohen, "depending on exactly how you do it, it's 200 to 500 times slower than the voltage signal that Yoav is looking at."
Now, Yoav can expand his vision further and look at how behavioral changes impact neural chatter. For his first attempt, he started simple: A mouse walked on a treadmill for 15 seconds and then rested for 15. During both stages, Yoav projected blue and red light onto the hippocampus region of the brain, a hub for learning and memory.
"Even just with simple changes in behavior, walking and resting," Yoav says, "we could see robust changes in the electrical signals, which also varied between different types of neurons in the hippocampus."
"Some go faster, some go slower," Cohen adds.
Yoav also observed different types of activity patterns: Some neurons exhibited complex spikes, like the undulating Appalachian mountain range, while others shot out big, sharp peaks, like Mount Everest. Such spikes can be detected by probes outside the cell membrane. But, Yoav can see the smaller voltage signals that ultimately determine whether a neuron spikes. These sub-threshold details have rarely been seen or studied in live animals: The right tools just didn't exist.
Next, Yoav and team will add more complexity to the mouse's treadmill environment: rough Velcro circles, whisker flicks, and a sugar station. Yoav, in particular, wants to learn more about spatial memory--for example, can the mouse remember where to find the sugar station? "No one knows what a memory really looks like," Cohen says. Soon, we might.
In the meantime, the interdisciplinary team will continue to sort through their intricate data, and improve their optical, molecular, and software tools. Better tools could capture more cells, deeper brain regions, and cleaner signals. "A mouse brain has 75 million cells in it," Cohen says, "so depending on your perspective, we've either done a lot or we still have quite a long way to go."
But Yoav, who pushed through five years of development challenges to reach their "holy grail," will keep pushing forward. To him, the end result always looked possible: "I could see the light."
This article has been republished from materials provided by Harvard University. Note: material may have been edited for length and content. For further information, please contact the cited source.
Reference: Adam, Y., Kim, J. J., Lou, S., Zhao, Y., Xie, M. E., Brinks, D., … Cohen, A. E. (2019). Voltage imaging and optogenetics reveal behaviour-dependent changes in hippocampal dynamics. Nature. https://doi.org/10.1038/s41586-019-1166-7