The way that we feel has the power to change the way that we perceive the world. You only have to look at optical illusions to realise that our perception is flexible and prone to errors (anyone remember that white-gold/blue-black dress from 2015?). Our sensory perceptions do not reveal the world as it really is; rather, they allow us to construct our own personal version of ‘reality’. The way that we construct this reality is affected by our biology, our emotional state, and even our past experiences.
All incoming sensory data is initially encoded by the brain as voltage changes in neurons, and these noisy, ambiguous signals have to go through many stages of processing to produce the sharp, unambiguous objects of our perceptions. One of the ways in which we can overcome the uncertainty inherent in our perceptual systems – and in the world around us – is by combining information from different senses. This process is called multisensory integration, and it helps us to build more accurate perceptions of objects in our environment. The next time you’re at a party and trying to follow your friend’s story over the background noise, you will find that it is much easier to make out what they’re saying if you look at them directly – the visual cues provided by their mouth movements will help you to more accurately decode their speech.
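To give a flavour of how combining senses reduces uncertainty: in standard cue-combination models, each sense’s estimate is weighted by its reliability (the inverse of its variance), and the combined estimate is always less uncertain than either sense alone. The short sketch below illustrates that rule only – the function name and numbers are made up for illustration, and this is not code or data from the research described here:

```python
def combine_cues(visual, auditory):
    """Combine two noisy estimates of the same quantity, each given as
    (mean, variance), using inverse-variance weighting -- the rule used
    in standard models of optimal multisensory integration."""
    (m_v, var_v), (m_a, var_a) = visual, auditory
    w_v = 1.0 / var_v          # reliability of the visual cue
    w_a = 1.0 / var_a          # reliability of the auditory cue
    mean = (w_v * m_v + w_a * m_a) / (w_v + w_a)
    variance = 1.0 / (w_v + w_a)  # always smaller than either input variance
    return mean, variance

# Hypothetical example: a sharp visual cue and a noisier auditory cue.
mean, variance = combine_cues(visual=(10.0, 1.0), auditory=(14.0, 3.0))
```

The combined estimate sits closer to the more reliable (visual) cue, and its variance is smaller than either cue’s on its own – which is why watching your friend’s mouth at a noisy party makes their speech easier to decode.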
Understanding emotions is harder for anxious individuals
Individuals with anxiety disorders are hypersensitive to threat, and so their ‘realities’ can look a lot scarier than other people’s. My research investigates whether these differences in perception arise because anxious individuals don’t integrate multisensory information in quite the same way as other people, and so end up weighting the senses in a way that favours a more negative view of the world. I recently carried out a study in which participants viewed pairs of faces and voices showing mismatched emotions (happy and angry) and were asked to judge the actor’s emotion. Whereas individuals with low anxiety were more likely to judge the actor’s emotion based on the face, anxious individuals were more likely to base their judgement on whichever of the two cues (the face or the voice) portrayed anger, and so were much more biased towards judging these ambiguous face–voice pairs as angry. This occurred even though the same highly anxious individuals were just as accurate as non-anxious individuals at perceiving happiness in individual faces and voices, suggesting that the differences in perception come from how the faces and voices are integrated.
In the future I would like to investigate whether this multisensory bias is specific to threat-related emotions (like anger) or whether it is a more general “negativity bias”, by looking at whether the same effect occurs for other emotion combinations, such as happy and sad emotional cues. Multisensory processing has been studied using electroencephalography (EEG) to illuminate the underlying brain mechanisms, and these studies show that multisensory integration affects both very early sensory processes (occurring less than 150 ms after an object is perceived) and much later cognitive processes, like updating working memory. I plan to investigate the mechanisms underlying multisensory changes in anxiety by using EEG to look for changes in brain activity that reflect specific perceptual and cognitive processes.
Can virtual reality help PTSD patients?
Individuals with post-traumatic stress disorder (PTSD) are prone to perceiving their trauma, and themselves, excessively negatively. These overly negative appraisals cause thoughts and feelings of danger to persist in the mind of the traumatised individual, even after the physical threat presented by the trauma has passed. Recovery from PTSD is believed to depend on the patient being able to mentally ‘put their trauma in the past’, so that reminders of their trauma are no longer perceived as signals of danger. I plan to investigate whether the perceptual differences we see in PTSD may be linked to altered multisensory integration, and whether we can modify the way that PTSD patients integrate multisensory information so that we can help them to ‘bring everything together’ in a way that supports a more constructive perception of themselves and the world.
One of the ways that PTSD is treated is by encouraging patients to mentally ‘relive’ their trauma by exposing them to sensations that remind them of it. This gives them the opportunity to incorporate new perspectives into distressing memories, and to test the belief that they will ‘go crazy’ if they think about their trauma. However, it can be difficult to get people with PTSD to emotionally engage with their trauma memories, because they may rely on strategies like emotional numbing or thought suppression to control their symptoms. Virtual reality exposure therapy, which involves immersion in a computer-generated virtual environment, has proven to be a successful alternative to more traditional exposure-based therapies: the immersive environment makes emotional avoidance much harder to sustain. Most of the virtual reality environments used for this application are inherently multisensory; they usually have both visual and audio components, and sometimes even tactile and olfactory sensations are incorporated. My aim is to work out whether we can use virtual reality to modify multisensory integration and so facilitate more adaptive emotional processing in PTSD. My hope is that, in the future, it may be possible to ‘ground’ patients with PTSD in a world that is more reflective of ‘reality’ by immersing them in a virtual one.