Which One is George?
George is a zebrafish. Along with Tom and 98 other mates, George swims freely in a laboratory tank at the Champalimaud Centre for the Unknown (CCU) in Lisbon, Portugal. An overhead camera records a video of all the animals’ comings and goings.
Question: is it possible to identify, from the video images, which fish is which at every moment? Yes, says Gonzalo de Polavieja, principal investigator of the Collective Behavior Lab, who with his team has built software called idtracker.ai (where “ai” stands for artificial intelligence, or AI) that can do the job quickly and with extreme accuracy. Their results were published in the journal Nature Methods.
“The ultimate goal of our team is understanding group behavior,” points out de Polavieja, whose preferred names for the fish, when he explains his work, are George and Tom. “We want to understand how animals in a group decide together and learn together.” And for that, it is essential to extract very high-quality data from the videos (such as the position and shape of each animal, as well as their individual paths) without mistakes.
Recognizing each individual among dozens of very similar ones would be impossible for a human brain to accomplish – or, for that matter, for a conventional computer program. “We would just go crazy trying,” says de Polavieja. For large crowds, without artificial intelligence in the mix, even a powerful computer might have to run the program for years to get results. And these would probably not be very accurate.
That’s where idtracker.ai comes in. The new software, adds de Polavieja, supplies the high-quality data that is absolutely needed in order to understand, in a second phase, the rules that drive the animals’ collective behavior.
Working in Spain four years ago, before joining the CCU, de Polavieja published, also in Nature Methods, a first version of the software, which did not resort to artificial intelligence. The results were much more modest. “We could track ten animals back then”, he says.
Video Credit: Champalimaud Research Science Communication Office
De Polavieja and his co-authors, Francisco Romero-Ferrero, Mattia Bergomi, Robert Hinz and Francisco Heras, have now tested the new AI version with groups of 30, then 50, then up to 100 zebrafish. “We didn’t test more than 100 because our tank is not big enough for that.” Nonetheless, using another approach to record the images, they showed that the software can identify up to 150 individual fish with very little loss of accuracy. “I didn’t believe we could reach those numbers; it was a surprise,” notes de Polavieja. “I thought there wouldn’t be enough information in the images.”
Idtracker.ai is composed of two so-called deep-learning neural networks and a few more conventional algorithms. A deep-learning neural network is a computer simulation of the brain’s real networks of neurons, and it is capable of learning from experience.
Using the video images of the zebrafish in the tank, the first network in the chain is trained to tell whether every little blob visible in the images corresponds to just one animal or to several.
With this output, the second neural network is then trained to assign a name (or number) to each blob that contains just one fish – in other words, to effectively identify each individual fish. The recognition is based on the unique features of each zebrafish. “People think zebrafish are all alike, but this proves they are in fact all different from each other”, notes de Polavieja.
Lastly, two conventional algorithms are used. “One is to gain some certainty about the [few individuals whose] identities are still somewhat uncertain,” says de Polavieja, “and the other to know which animal is which when their paths cross” – that is, when their trajectories appear superimposed on the video.
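The two-stage pipeline described above can be illustrated with a toy sketch. Here a simple pixel-area threshold stands in for the first (crossing-detector) network, and a nearest-centroid match stands in for the identification network; the real idtracker.ai uses trained deep networks for both stages, and the threshold, positions and function names below are purely hypothetical.

```python
# Toy sketch of the two pipeline stages described above.
# NOTE: the real idtracker.ai uses two trained deep networks; this
# stand-in uses a simple area threshold and a nearest-centroid match.

SINGLE_FISH_MAX_AREA = 120  # hypothetical pixel-area threshold


def classify_blob(area):
    """Stage 1: decide whether a blob is one fish or a crossing of several."""
    return "single" if area <= SINGLE_FISH_MAX_AREA else "crossing"


def assign_identity(blob_centroid, known_positions):
    """Stage 2: name the fish in a single-animal blob by proximity to the
    last known position of each identity (toy stand-in for the
    identification network, which uses each fish's visual features)."""
    x, y = blob_centroid
    return min(
        known_positions,
        key=lambda name: (known_positions[name][0] - x) ** 2
                         + (known_positions[name][1] - y) ** 2,
    )


# Last known positions of two fish (hypothetical data).
positions = {"George": (10.0, 12.0), "Tom": (40.0, 7.0)}

print(classify_blob(90))                          # single
print(classify_blob(300))                         # crossing
print(assign_identity((11.0, 11.0), positions))   # George
```

In the real system, blobs flagged as crossings are resolved afterwards by the conventional algorithms the quote mentions, using the identities established before and after each crossing.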
The results speak for themselves: it takes about an hour for idtracker.ai to identify each and every one of the 100 zebrafish in the video at all times with almost 100% accuracy! “If you show the network a random part of the video it has never seen before and ask it: ‘who is this?’, the network will correctly assign the right name – or number – to that fish 99.997% of the time”, says de Polavieja. And if you ask it where George, or Tom, or any other zebrafish is at a given moment, it will find it in the crowd almost beyond the shadow of a doubt.
The team also tested the software with fruit flies, medaka fish (Japanese rice fish), ants and mice. It also works, though with smaller numbers of individuals. “Zebrafish are the best” for these studies, says de Polavieja. On the other hand, “mice are more difficult because they tend to cluster and deform”.
“This is the first time such high quality data has been obtained for 100 fish,” concludes de Polavieja. The team has now used the data supplied by idtracker.ai – which is freely available to everyone from www.idtracker.ai – to extract a set of rules that explain, to a large extent, zebrafish behavior in groups. They describe their results in another paper, which they have posted on biorxiv.org and have submitted for publication in a scientific journal.
As for possible applications, this software and others like it allow (or will allow) for tracking people, or for identifying a given person in a crowd based on information about his or her physical appearance. “There is now a whole industry for this type of software,” says de Polavieja. “People are applying these [AI] techniques to develop other similar tracking tools. But before we proved we could do it in animals, it was hard to believe it was even remotely possible.”
This article has been republished from materials provided by Fundação Champalimaud. Note: material may have been edited for length and content. For further information, please contact the cited source.
Reference: Romero-Ferrero, F., Bergomi, M. G., Hinz, R. C., Heras, F. J. H., & de Polavieja, G. G. (2019). idtracker.ai: tracking all individuals in small or large collectives of unmarked animals. Nature Methods. https://doi.org/10.1038/s41592-018-0295-5