
The Sound of Joy: Interface Lets People Translate Emotion Through Song



New research by experts from Durham University’s Department of Music has found that people can convey particular emotions through music by changing certain elements of a musical piece.

The researchers created an interactive computer interface called EmoteControl which allows users to control six cues (tempo, pitch, articulation, dynamics, brightness, and mode) of a musical piece in real time.

The participants were asked to show how they think seven different emotions (sadness, calmness, joy, anger, fear, power, and surprise) should sound in music. They did this by changing the musical cues in EmoteControl, essentially creating their own variations of a range of musical pieces, each portraying a different emotion.

In general, musical cues were used in a similar way to represent a specific emotion. For example, participants conveyed sadness in the music using a slow tempo, minor mode, soft dynamics, legato articulation, low pitch level, and a dark timbre.
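The cue settings described above can be pictured as a simple parameter profile. The sketch below is a hypothetical illustration only, not the actual EmoteControl software: the cue names come from the article, but the value ranges, units, and the `describe` helper are assumptions made for the example.

```python
# Hypothetical cue profile for "sadness", based on the settings participants
# typically chose. Values, units, and ranges are illustrative assumptions.
sadness_profile = {
    "tempo_bpm": 60,               # slow tempo
    "mode": "minor",               # minor mode
    "dynamics": "soft",            # soft dynamics
    "articulation": "legato",      # smoothly connected notes
    "pitch_shift_semitones": -12,  # low pitch level
    "brightness": "dark",          # dark timbre
}

def describe(profile):
    """Render a cue profile as a human-readable summary string."""
    return ", ".join(f"{cue}={value}" for cue, value in profile.items())

print(describe(sadness_profile))
```

In the real interface these six cues are adjusted continuously and applied to the music in real time; a static dictionary like this only captures one snapshot of a participant's settings.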

Tempo and mode were the two cues that most strongly affected the emotion being conveyed, while the dynamics and brightness cues had the least effect on shaping the different emotions in the music.

The researchers also found that sadness and joy were among the most accurately recognised emotions, which is consistent with previous studies.

Professor Tuomas Eerola of Durham University said: “This interactive approach allowed us to tap into the participants’ perception of how different emotions should sound in music and helped the participants create their own emotional variations of music that encompassed different emotional content.”

This research and the EmoteControl interface have implications for other sectors where emotional content is conveyed through music, such as sound branding (marketing), music in film and TV, adaptive music in gaming, as well as the potential to be used as an emotion communication medium for clinical purposes.

Reference: Micallef Grimaud A, Eerola T. An Interactive Approach to Emotional Expression Through Musical Cues. Music & Science. 2022;5:20592043211061744. doi:10.1177/20592043211061745

This article has been republished from the following materials. Note: material may have been edited for length and content. For further information, please contact the cited source.