A New Way To Train Brain-Inspired AI
News


Illustration of the on-chip classification process with the Yin-Yang dataset. Each symbol represents the spike time delay for various classifying neurons. Credit: Göltz and Kriener et al. (Heidelberg / Bern)

Developing a machine that processes information as efficiently as the human brain has been a long-standing research goal on the road to true artificial intelligence. An interdisciplinary research team at Heidelberg University and the University of Bern (Switzerland), led by Dr Mihai Petrovici, is tackling this problem with the help of biologically inspired artificial neural networks. Spiking neural networks, which mimic the structure and function of a natural nervous system, are promising candidates because they are powerful, fast, and energy-efficient. One key challenge is how to train such complex systems. The German-Swiss research team has now developed and successfully implemented an algorithm that achieves such training.

The nerve cells (or neurons) in the brain transmit information using short electrical pulses known as spikes. These spikes are triggered when a certain stimulus threshold is exceeded. Both the frequency with which a single neuron produces such spikes and the temporal sequence of the individual spikes are critical for the exchange of information. “The main difference of biological spiking networks to artificial neural networks is that, because they are using spike-based information processing, they can solve complex tasks such as image recognition and classification with extreme energy efficiency,” states Julian Göltz, a doctoral candidate in Dr Petrovici’s research group.
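The article does not give the neuron model's equations. As a generic illustration of the threshold mechanism described above, the following sketch simulates a simple leaky integrate-and-fire neuron; the model, parameters, and time step are illustrative assumptions, not the hardware's actual dynamics:

```python
def lif_spike_times(input_current, threshold=1.0, tau=20.0, dt=1.0):
    """Simulate a leaky integrate-and-fire neuron and record its spike times.

    The membrane potential leaks toward rest while integrating the input;
    a spike is emitted whenever the stimulus threshold is exceeded, after
    which the potential resets.
    """
    v = 0.0
    spike_times = []
    for step, drive in enumerate(input_current):
        v += dt * (-v / tau + drive)      # leaky integration of the input
        if v >= threshold:                # stimulus threshold exceeded
            spike_times.append(step * dt) # record when the spike occurred
            v = 0.0                       # reset after firing
    return spike_times

# A constant drive yields a regular spike train; a stronger drive
# makes the neuron reach threshold sooner and fire more often.
weak = lif_spike_times([0.08] * 100)
strong = lif_spike_times([0.2] * 100)
```

Both the number of spikes (rate) and their timing carry information here, mirroring the two coding schemes mentioned in the text.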

Both the human brain and the architecturally similar artificial spiking neural networks can only perform at their full potential if the individual neurons are properly connected to one another. But how can brain-inspired – that is, neuromorphic – systems be adjusted to process spiking input correctly? “This question is fundamental for the development of powerful artificial networks based on biological models,” stresses Laura Kriener, also a member of Dr Petrovici’s research team. Special algorithms are required to guarantee that the neurons in a spiking neural network fire at the correct time. These algorithms adjust the connections between the neurons so that the network can perform the required task, such as classifying images with high precision.
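The article does not spell out the learning rule itself. As a toy sketch of the general idea of adjusting a connection so a neuron fires at the correct time, the following uses a hypothetical leak-free integrator (for which the spike time has a closed form) and plain gradient descent on the spike-time error; this is an illustrative assumption, not the authors' algorithm:

```python
def toy_spike_time(w, drive, threshold=1.0):
    """Leak-free integrator: with constant input `drive` weighted by `w`,
    the membrane crosses `threshold` at time t = threshold / (w * drive)."""
    return threshold / (w * drive)

def train_spike_time(w, drive, t_target, lr=0.01, steps=200, threshold=1.0):
    """Nudge the weight by gradient descent on the squared spike-time
    error until the neuron fires at the target time."""
    for _ in range(steps):
        t = toy_spike_time(w, drive, threshold)
        # dt/dw = -threshold / (w**2 * drive); chain rule on (t - t_target)**2
        grad = 2.0 * (t - t_target) * (-threshold / (w**2 * drive))
        w -= lr * grad
    return w

# Starting at w = 0.5 the neuron fires at t = 2.0; training moves the
# weight so that the spike lands near the 1.5 ms target.
w_trained = train_spike_time(w=0.5, drive=1.0, t_target=1.5)
```

Real spiking networks need considerably more machinery (many neurons, exact gradients through spike times, hardware constraints), but the principle of differentiating the spike time with respect to the weights is the same.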

The team under the direction of Dr Petrovici developed just such an algorithm. “Using this approach, we can train spiking neural networks to code and transmit information exclusively in single spikes. They thereby produce the desired results especially quickly and efficiently,” explains Julian Göltz. Moreover, the researchers succeeded in implementing a neural network trained with this algorithm on a physical platform – the BrainScaleS-2 neuromorphic hardware platform developed at Heidelberg University.
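When information is coded in single spikes, classification can be read out from which output neuron fires first. A minimal sketch of such a first-spike readout (the function name and input format are illustrative assumptions):

```python
def classify_by_first_spike(first_spike_times):
    """Time-to-first-spike readout: the predicted class is the index of
    the output neuron that fires earliest. `first_spike_times` holds one
    entry per class neuron: its first spike time, or None if it stayed
    silent (treated as firing infinitely late).
    """
    best_label, best_time = None, float("inf")
    for label, t in enumerate(first_spike_times):
        t = float("inf") if t is None else t
        if t < best_time:
            best_label, best_time = label, t
    return best_label

# Neuron 2 fires first (at 3.1 ms), so the input is assigned class 2.
print(classify_by_first_spike([7.4, None, 3.1, 5.0]))  # prints 2
```

Because the decision is available as soon as the earliest output neuron spikes, this readout is what makes single-spike coding especially fast and energy-efficient.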

According to the researchers, the BrainScaleS system processes information up to a thousand times faster than the human brain and needs far less energy than conventional computer systems. It is part of the European Human Brain Project, which integrates technologies like neuromorphic computing into an open platform called EBRAINS. “However, our work is not only interesting for neuromorphic computing and biologically inspired hardware. It also acknowledges the demand from the scientific community to transfer so-called Deep Learning approaches to neuroscience and thereby further unveil the secrets of the human brain,” emphasises Dr Petrovici.

Reference:

Göltz J, Kriener L, Baumbach A, et al. Fast and energy-efficient neuromorphic deep learning with first-spike times. Nat Mach Intell. 2021;3(9):823-835. doi:10.1038/s42256-021-00388-x

This article has been republished from the following materials. Note: material may have been edited for length and content. For further information, please contact the cited source.
