

Can We Decode the Language of Our Primate Cousins?

Two chimps sat facing each other. Credit: Vasilis Caravitis/Unsplash

Are we able to differentiate between the vocal emissions of certain primates? A team from the University of Geneva (UNIGE) asked volunteers to categorise the vocalisations of three species of non-human primates and of humans. During each exposure to these "onomatopoeia", brain activity was measured. In contrast to previous studies, the scientists reveal that phylogenetic proximity - or kinship - is not the only factor influencing our ability to identify these sounds. Acoustic proximity - the type of frequencies emitted - is also a determining factor. These results show how the human brain has evolved to process the vocal emissions of some of our closest cousins more efficiently. Find out more in the journal Cerebral Cortex Communications.


Our ability to process verbal language is not based solely on semantics, i.e. the meaning and combination of linguistic units. Other parameters come into play, such as prosody, which includes pauses, accentuation and intonation. Affective bursts - "Aaaah!" or "Oh!" for example - are also part of this, and we share these with our primate cousins. They contribute to the meaning and understanding of our vocal communications.


When such a vocal message is emitted, these sounds are processed by the frontal and orbitofrontal regions of our brain. The function of these two areas is, among other things, to integrate sensory and contextual information leading to a decision. Are they activated in the same way when we are exposed to the emotional vocalisations of our close cousins the chimpanzees, macaques and bonobos? Are we able to differentiate between them?

MRI scans with headphones on

A UNIGE team sought to find out by exposing a group of 25 volunteers to various human and simian vocalisations. "The participants were placed in an MRI scanner and were given headphones. After a short period of familiarisation with the different types of vocalisations, each participant had to categorise them, i.e. identify to which species they belonged," explains Leonardo Ceravolo, senior lecturer at the UNIGE's Faculty of Psychology and Educational Sciences, and first author of the study.

These vocalisations were of the affiliative type, i.e. linked to a positive interaction, or of the agonistic type, i.e. linked to a threat or distress. The human vocalisations came from databases recorded by actors. The simian ones came from field recordings made as part of previous research. This study is the first of its kind to include bonobo vocalisations.

Bonobos, not so close cousins

The results show that, for macaque and chimpanzee vocalisations, the participants' frontal and orbitofrontal regions were activated in much the same way as for human vocalisations, and the participants were able to differentiate between them easily. On the other hand, when confronted with the "sounds" of bonobos, also close cousins of humans, the cerebral areas involved were much less activated, and categorisation was at chance level.


"It was thought that kinship between species - the 'phylogenetic distance' - was the main parameter determining whether or not we have the ability to recognise these different vocalisations. We assumed that the closer we are genetically, the stronger this ability would be," explains Didier Grandjean, full professor at the Swiss Center for Affective Sciences and at the UNIGE's Faculty of Psychology and Educational Sciences, who led the study. "Our results show that a second parameter comes into play: acoustic distance. The further the dynamics of the acoustic parameters, such as the frequencies used, are from those of human vocalisations, the less certain frontal regions are activated. We then lose the ability to recognise these sounds, even if they are emitted by a close cousin, in this case the bonobo."


Bonobo calls are very high-pitched and can sound like those of certain birds. This acoustic distance in terms of frequencies, compared with human vocalisations, explains our inability to decode them, despite our close phylogenetic proximity. "Are we capable of identifying the different emotional aspects of affiliative or agonistic vocalisations emitted by a chimpanzee, a macaque or a bonobo? And if so, how? These questions will be at the heart of our next research, which will involve analysing not our ability to categorise vocalisations by species but to identify their emotional content," concludes Didier Grandjean.


Reference: Ceravolo L, Debracque C, Pool E, Gruber T, Grandjean D. Frontal mechanisms underlying primate calls recognition by humans. Cereb Cortex Commun. 2023;4(4):tgad019. doi: 10.1093/texcom/tgad019


This article has been republished from the following materials. Note: material may have been edited for length and content. For further information, please contact the cited source.