Virtual Reality Makes Food Tastier for Flies

From edible to irresistible: Creating virtual taste realities with light. Credit: Diogo Matias.


The fly hasn’t eaten for an entire day, and it’s starving. Finally, it finds a pile of edible gelatinous goo and begins to eat. Suddenly, a green light appears, and the food, which was far from delicious a moment ago, becomes irresistibly sweet. The fly, excited by the sudden improvement, eats with increased vigor. But its enthusiasm quickly wanes when the green light disappears and the flavor of the food reverts to its original blandness.


Reflecting on such an unusual situation, one might guess that it was the result of an added sweetener, or some sort of temporary delusion. But no: the answer lies in what, until recently, might have been considered science fiction.


“The fly’s experience was very real. It was a virtual taste created by directly manipulating its taste neurons”, says Carlos Ribeiro, head of the Behavior and Metabolism Lab at the Champalimaud Centre for the Unknown in Lisbon, Portugal. Together with his team, Ribeiro developed the optoPAD: a system that creates “virtual taste realities” that can be flexibly paired with the fly’s behavior. They describe this new technology in a scientific article published in the journal eLife.


Creating virtual taste realities

The optoPAD combines two high-tech elements: the first is optogenetics, a powerful method that uses light to control the activity of neurons (quite literally to turn them “on” or “off”). For instance, the fly mentioned earlier was briefly enjoying more appetizing food because its sweet-sensing neurons were optogenetically activated by exposure to green light.


The second element of the optoPAD is an additional system, previously developed in the lab, called flyPAD. “The flyPAD uses touchscreen-type technology to monitor the fly’s feeding behavior. Just like your phone is able to detect the touch of your finger on the screen, flyPAD is able to detect whenever the fly touches the food”, explains José-Maria Moreira, one of the lead co-authors of the study.
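
To make the idea concrete, here is a toy Python sketch of capacitance-based contact detection in the spirit of the flyPAD. It is purely illustrative: the sensor function, baseline and threshold values are hypothetical stand-ins for a real acquisition setup, and the team’s actual software is linked at the end of this article.

```python
import random

BASELINE = 10.0    # resting capacitance of the food electrode (arbitrary units)
THRESHOLD = 2.0    # rise above baseline that counts as a touch


def read_capacitance() -> float:
    """Hypothetical sensor read: a noisy baseline, with occasional
    simulated touches that raise the signal."""
    touch = 5.0 if random.random() < 0.1 else 0.0
    return BASELINE + touch + random.gauss(0.0, 0.3)


def fly_is_touching(sample: float) -> bool:
    """A touch registers when the signal rises clearly above baseline,
    much as a phone registers a finger on its screen."""
    return sample - BASELINE > THRESHOLD


if __name__ == "__main__":
    for _ in range(10):
        c = read_capacitance()
        print(f"{c:6.2f}  {'touch' if fly_is_touching(c) else '-'}")
```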


By combining flyPAD with optogenetics, the researchers were able to overcome one of the main challenges in the field of feeding research: precisely controlling taste sensations. 


Unlike auditory or visual stimuli, which can be altered instantaneously and independently of the animal’s behavior, taste is only experienced when the animal voluntarily touches the food with its tongue, or, in the case of the fly, its proboscis. “With optoPAD, we are constantly monitoring the behavior of the fly, to ensure that we optogenetically change the taste of the food precisely when the fly is in contact with it”, Moreira explains.
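
The logic of this closed loop is simple enough to sketch. The toy Python example below is not the authors’ implementation (their software is freely available via the link at the end of this article); the sensor and LED functions are hypothetical stand-ins, and the loop simply switches the stimulation light whenever the fly’s contact with the food starts or ends.

```python
import random
import time

BASELINE = 10.0    # resting capacitance (arbitrary units)
THRESHOLD = 2.0    # rise that counts as food contact


def read_capacitance() -> float:
    """Hypothetical stand-in for the flyPAD-style sensor read."""
    touch = 5.0 if random.random() < 0.2 else 0.0
    return BASELINE + touch + random.gauss(0.0, 0.3)


def set_led(on: bool) -> None:
    """Hypothetical stand-in for driving the stimulation light."""
    print("LED on" if on else "LED off")


def closed_loop(n_samples: int = 50, period_s: float = 0.01) -> None:
    """Sample the electrode and gate the light on contact, so the
    'virtual taste' exists only while the fly touches the food."""
    led_on = False
    for _ in range(n_samples):
        touching = read_capacitance() - BASELINE > THRESHOLD
        if touching != led_on:   # switch the light only when contact changes
            set_led(touching)
            led_on = touching
        time.sleep(period_s)


if __name__ == "__main__":
    closed_loop()
```

In a real system, the loop must run fast enough that the light effectively tracks each touch; the sketch only simulates that timing.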


Taste and beyond

In this study, the researchers show that the optoPAD effectively pairs active feeding with optogenetic manipulations, and that these virtual tastes have a very real effect on the behavior of the flies.


For instance, they can make the fly eat excessively by optogenetically activating sweet-sensing neurons, or make it stop eating altogether, regardless of how hungry it is, by activating bitter-sensing neurons.


For the researchers, manipulating taste was a good beginning, but it wasn’t enough. “We developed the optoPAD because we are interested in understanding how the brain makes one of the most fundamental decisions for our health: what food to eat”, says Dennis Goldschmidt, another lead co-author. “But food choices do not depend only on taste; many parts of the brain are involved, so we wanted to ensure that the optoPAD can be used to study the activity of neurons anywhere”.


Since taste neurons are located in the mouth of the fly, where they are easily accessible to the light needed for manipulation, the team chose a more difficult target: neurons at the center of the brain that are involved in jumping reactions.


The results were clear: “as we expected, optogenetic stimulation of these ‘jumping’ neurons made the flies jump and stop feeding, showing that we can indeed study any neuron, regardless of its location, in order to understand its role in the brain’s feeding circuitry”, says Goldschmidt.


Next steps 

Though optoPAD seems like a fantastic way to improve one’s nutrition without compromising taste, the researchers’ goal is to use this technology to improve human life in a more fundamental way. “The food we eat affects all aspects of our lives, including aging, ability to reproduce, lifespan, mental state and mood”, says Ribeiro. “Yet, how the brain controls food choice is still a mystery. The optoPAD can help us identify the neurons and genes that may have a direct impact on nutrition and hence our well-being in years to come.” 


The team is now gearing up to start a series of new experiments, and they are already sharing this new technology with the scientific community by making all the blueprints and software freely available here: http://ribeirolab.org/optopad/. “We expect that the flexibility of the optoPAD will allow researchers not only to study feeding behavior, but also to explore how flies adapt their behavior to complex environmental features, which may in turn lead to the identification of novel neuronal circuits and computations”, Ribeiro concludes.

Reference: Moreira, J.-M., Itskov, P. M., Goldschmidt, D., Baltazar, C., Steck, K., Tastekin, I., … Ribeiro, C. (2019). optoPAD, a closed-loop optogenetics system to study the circuit basis of feeding behaviors. eLife, 8, e43924. https://doi.org/10.7554/eLife.43924

This article has been republished from the following materials. Note: material may have been edited for length and content. For further information, please contact the cited source.