Beast Machines and Hallucination Networks: Anil Seth Discusses Consciousness at BNA 2021
The basis of our consciousness has perplexed experienced researchers, esteemed thinkers and the rest of humanity for several thousand years. Advances in neuroscience have given us a uniquely analytical approach to understanding consciousness. At the British Neuroscience Association’s 2021 Festival of Neuroscience, University of Sussex Professor Anil Seth used his plenary lecture to dive deeper into new perspectives in our scientific understanding of consciousness.
The “real problem” of consciousness
Seth began his lecture by examining researcher David Chalmers’s focus on the “hard” and “easy” problems of consciousness.
Chalmers’s “hard” problem identified our fundamental ignorance about why our brain’s physical processing should give rise to our rich experience of the world. “It seems objectively unreasonable that it should,” said Chalmers, “and yet it does.”
Chalmers contrasted this metaphysical mindbender with his “easy” problems of consciousness. These questions are not “easy” because they are simple to answer, but because we at least have a definite scientific route that could lead to an answer. They revolve around understanding various processes that Seth places “in the vicinity of consciousness”. How do we perceive the world? How does cognition work? These can be solved, Seth pointed out, without necessarily needing to engage with the deeper question of why these processes should amalgamate into our conscious perception of the world.
In between these problems, Seth placed what he called the “real” problem of consciousness. “My take on this is a more pragmatic approach,” said Seth. “The real problem can be expressed very simply. How can mechanisms and processes in the brain and the body explain, predict and control properties of consciousness?”
This approach aims to “dissolve” rather than “solve” the hard problem, said Seth, by breaking the “big, scary mystery” of consciousness down into smaller problems that researchers can attempt to experimentally answer. Seth focused on two of these smaller problems in his plenary – how can we explain the content of our consciousness and how can we explain the experience of conscious self?
Why do we experience the world in the way we do?
With the aim of answering these questions, Seth explored a key concept in contemporary consciousness research that aims to solve the “real problem” of perception: the idea of the brain as a “prediction machine”.
Seth suggested that the brain, trapped in our skull, spends its time trying to guess what all the electrical signals that arrive from our sensory organs mean. This “prediction engine” constantly creates our perception by balancing predictions against prediction errors: when the brain’s predictions about the world are mistaken, the resulting sensory mismatch helps it update those predictions, and this continually refined “best guess” is our perception of the world.
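This predict-and-correct loop can be illustrated with a toy sketch. The specific update rule below (nudging the guess toward the signal by a fixed fraction of the prediction error) is an assumption chosen for simplicity, not a model Seth described; it just shows how repeated error correction drives a “best guess” toward a hidden quantity it never observes directly.

```python
# Illustrative sketch only: a toy "prediction engine" refining its best
# guess of a hidden quantity from noisy sensory signals. The update rule
# (guess += rate * error) is an assumption made for clarity.
import random

def perceive(true_value, steps=200, rate=0.1, noise=0.5, seed=0):
    rng = random.Random(seed)
    guess = 0.0                                       # initial "best guess"
    for _ in range(steps):
        sensation = true_value + rng.gauss(0, noise)  # noisy sensory signal
        error = sensation - guess                     # prediction error
        guess += rate * error                         # correct the guess
    return guess

print(perceive(true_value=3.0))                       # settles near 3.0
```

Note that the guess never equals the world exactly; it hovers around it, which is the sense in which perception remains a guess rather than a readout.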
This focus means that subsequent experimentation can sidestep the “hard” questions of what such perception might say about our place in the universe and get on with answering practical queries about how we see the world. One fascinating aspect of these experiments, Seth explained, was a project he is currently working on to create computational models of hallucinations.
The hallucination network
Hallucinations, Seth said, vary widely in their complexity, spontaneity and how realistic their content appears and feels. Seth and colleagues at the University of Sussex are investigating hallucinations in an ongoing study that uses a pair of computational networks. One is a perceptual engine, a network capable of working out what a given image is based on previous data fed into it, and the other a generator network tasked with reproducing images presented to it.
By running the perceptual engine backwards and the generator network forwards, the team were able to create simulations of visual hallucinations. By playing around with the engine’s settings, Seth’s team could mimic both the complex neurological hallucinations seen in Lewy body dementia and psychedelic imagery. The takeaway from this research, said Seth, was that “if we can think of hallucination as uncontrolled perception, where the brain’s best guesses are not reined in as they normally are by sensory stimulation from the world, then we can think of perception in the here and now as controlled hallucinations.”
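The idea of “running a recognition network backwards” can be sketched crudely. The code below is not the Sussex team’s model: it uses a stand-in linear “perceptual network” and gradient ascent on the input, so that the network’s own expectations, rather than real sensory data, shape what it ends up “seeing” — a minimal analogue of uncontrolled perception.

```python
# Illustrative sketch, not the actual Sussex model: hallucination as
# running a recognizer backwards via gradient ascent on the input image.
import numpy as np

rng = np.random.default_rng(0)
n_pixels, n_classes = 16, 3
W = rng.normal(size=(n_classes, n_pixels))   # stand-in "perceptual network"

def hallucinate(target_class, steps=100, rate=0.1):
    image = np.zeros(n_pixels)               # start from a blank input
    for _ in range(steps):
        # For a linear score (W @ image), the gradient of the target
        # class's score with respect to the image is that class's weight
        # row, so each step pushes the image toward what the network
        # expects that class to look like.
        image += rate * W[target_class]
    return image

img = hallucinate(0)
print((W @ img).argmax())                    # the network now "sees" class 0
```

In deep-network versions of this idea the gradient changes at every step, producing the dream-like imagery the lecture described; the linear case just makes the mechanism visible in a few lines.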
Being a body
Seth then looked at the experience of being a conscious self. To start with, he summarily dismissed the phenomenological interpretation that our conscious selves are generated in the act of receiving signals from the outside world. Instead, Seth proposed that both our experience of the world and of our self are forms of perception, created in the same fashion.
One aspect of selfhood that Seth explored was the concept of “being a body” rather than just “having a body” – Seth called this “the shapeless, formless experience of just being a living organism, which could be the very base of conscious selfhood.” Linked to this is interoception, our perception of sensations from within our bodies, like hunger or a racing heart. Seth said that interoception is a way of controlling variables in our body, such as our blood pressure or heart rate, so that they remain within limits that allow us to survive.
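This framing of interoception as control, rather than passive sensing, can be sketched with a toy regulator. The proportional controller and the numbers below are assumptions for illustration only; the point is that the system acts on its internal state to pull a vital variable back inside a viable range.

```python
# Illustrative sketch: interoception framed as control -- a regulator
# keeping an internal variable (a toy "heart rate") near a viable
# setpoint. The proportional control rule is an assumption for clarity.
def regulate(rate, setpoint=70.0, gain=0.3, steps=50, disturbance=20.0):
    rate += disturbance                 # a sudden stressor pushes it up
    for _ in range(steps):
        error = setpoint - rate         # sensed deviation from the setpoint
        rate += gain * error            # corrective action shrinks the error
    return rate

print(round(regulate(70.0), 1))         # prints 70.0 -- back in range
```

On Seth's account the purpose of the loop is regulation, not accuracy: the system only “perceives” its heart rate in order to keep it survivable.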
In this way, Seth argued that perception and self were reflections of the same prediction engine – our brains use vision to predict our perceptual experience and interoception to predict our embodied experience, such as our emotions and moods. Seth suggested that ultimately all our perceptual experience is grounded in a biological drive to stay alive. “We perceive the world around us, and ourselves within it, with, through and because of our living bodies,” said Seth.
Concluding, Seth explained why he had entitled his talk “Real Problems and Beast Machines.” The latter term refers to French philosopher René Descartes’s idea that animals, lacking minds, were “unthinking, unfeeling machines that move like clockwork”. Seth suggested that the opposite may in fact be true – “Our conscious selfhood emerges because of, and not in spite of, our beast machine nature.”