Optimising Complex Networks on a Neuromorphic Computer System

14 August 2020

Researchers from Heidelberg and Göttingen study the effect of so-called critical states

Like biological systems, artificial neural networks distribute computations across their interconnected neurons to solve complex tasks. Scientists from Heidelberg University and the Max Planck Institute for Dynamics and Self-Organization in Göttingen studied how so-called critical states can be used to optimise these networks. They used a prototype of the BrainScaleS-2 system, developed by Heidelberg physicists within the framework of the Human Brain Project. This neuromorphic computer architecture is modelled on the structure of the human brain. The results of this research were published in the journal “Nature Communications”.


Complex networks develop a number of special properties when poised at a “critical point”, a state in which a system can quickly change its fundamental behaviour, transitioning, for example, between order and chaos or stability and instability. Many computational characteristics are optimised in this state, which is why criticality is widely assumed to be optimal for any computation in recurrent neural networks. Their recurrent, loop-like connections give these networks a type of “memory”, allowing them to make predictions, similar to human cognition. They are therefore used in many artificial intelligence applications.
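The notion of a critical point can be illustrated with a branching process, a standard toy model of activity spreading in recurrent networks (this sketch is purely illustrative and not taken from the study): each active neuron activates on average m others in the next time step. For m < 1 activity dies out, for m > 1 it explodes, and m = 1 marks the critical point separating the two regimes.

```python
import math
import random

def poisson(rng, lam):
    """Draw from a Poisson(lam) distribution via Knuth's method
    (adequate for the small rates used here)."""
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def simulate_branching(m, steps, a0=100, seed=0):
    """Branching process: each of the a_t active units activates
    on average m units in the next time step."""
    rng = random.Random(seed)
    a, trace = a0, [a0]
    for _ in range(steps):
        a = sum(poisson(rng, m) for _ in range(a))
        trace.append(a)
        if a == 0:  # activity has died out
            break
    return trace

# Subcritical (m < 1): activity decays and eventually dies out.
sub = simulate_branching(0.8, steps=200)
# Supercritical (m > 1): activity tends to grow without bound.
sup = simulate_branching(1.2, steps=30)
```

Running the subcritical case typically shows activity vanishing within a few dozen steps, while the supercritical case grows rapidly; only near m = 1 do fluctuations persist across all timescales, which is the regime the networks discussed here exploit.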

The researchers from Heidelberg and Göttingen challenged the assumption that critical states are optimal for all calculations in recurrent neural networks. They tested the performance of a complex network that, like the brain, encodes information in short electronic pulses known as spikes. Their network ran on a prototype of the BrainScaleS-2 system, a second-generation neuromorphic computer architecture developed at the Kirchhoff Institute for Physics. It is based on electronic models of neuronal circuits, thus mimicking the human brain in both structure and function. To implement synaptic plasticity, the foundation of learning and memory, the researchers used a special processor located directly on the BrainScaleS-2 chip. They then altered the network’s distance to criticality for tasks of varying complexity and evaluated its performance.

In their experiments, the scientists showed that the distance to the critical point can easily be adjusted in the chip by changing the input strength. They also observed a clear relationship between criticality and system performance for the various tasks. “The general assumption that the critical state is beneficial for every calculation was not confirmed, however,” states Benjamin Cramer, the study’s primary author. In fact, only the complex, memory-intensive tasks benefitted from criticality, whereas the simple tasks actually suffered. “The results of the research therefore provide a more precise understanding of how the collective network state should be tuned to different task requirements,” adds the Heidelberg physicist, who is a member of the “Electronic Vision(s)” research group at the Kirchhoff Institute for Physics.

Mechanistically, the optimal working point for each task can be set very easily under homeostatic plasticity by adapting the mean input strength. The theory behind this mechanism was developed very recently in the group of Dr Viola Priesemann at the Max Planck Institute for Dynamics and Self-Organization in Göttingen. Applying these plasticity rules to neuromorphic hardware demonstrates that they are quite capable of tuning network dynamics to varying distances to criticality. Hence tasks of varying complexity can be solved optimally. According to Benjamin Cramer, the findings may also explain why biological neural networks do not necessarily operate at criticality but in the dynamically rich vicinity of a critical point, where they can tune their computational properties to the requirements of the task.
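The homeostatic mechanism described above can be sketched in a simple mean-field picture (an illustrative toy model under assumed parameters, not the study’s actual plasticity rule): with recurrent coupling w and external input h, the stationary firing rate is r = h / (1 − w), and a homeostatic rule that nudges w towards a target rate settles at w* = 1 − h / r_target. Weak input therefore drives the network close to the critical point w = 1, while strong input drives it away.

```python
def homeostatic_fixed_point_demo(h, r_target=10.0, eta=0.01, steps=20000):
    """Mean-field sketch: stationary rate of a recurrent network with
    coupling w and external input h is r = h / (1 - w). Homeostatic
    plasticity nudges w until r matches the target rate."""
    w = 0.0
    for _ in range(steps):
        r = h / (1.0 - w)
        w += eta * (r_target - r) / r_target  # homeostatic update
        w = min(max(w, 0.0), 0.999)           # keep the network stable
    return w

# Weak input: homeostasis settles close to the critical point w = 1.
w_weak = homeostatic_fixed_point_demo(h=0.1)    # w* = 1 - 0.1/10 = 0.99
# Strong input: homeostasis settles far from criticality.
w_strong = homeostatic_fixed_point_demo(h=5.0)  # w* = 1 - 5/10  = 0.5
```

In this simplified picture, the same homeostatic rule yields different distances to criticality purely as a function of the mean input strength, mirroring the tuning mechanism the researchers exploited on the chip.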

Original publication

Cramer, B., Stöckel, D., Kreft, M., Wibral, M., Schemmel, J., Meier, K., & Priesemann, V.: Control of criticality and computation in spiking neuromorphic networks with plasticity. Nature Communications 11, 2853 (2020).