Project Details

Recurrent neural computing controlled by adaptive noise and regulatory feedback

Applicant Dr. Patrick Krauss
Subject Area Experimental and Theoretical Network Neuroscience
Term since 2024
Project identifier Deutsche Forschungsgemeinschaft (DFG) - Project number 542747151
 
The objective of this project is to explore the potential of recurrent neural networks (RNNs) as highly dynamic and adaptive systems, and the role of feedback mechanisms, including noise, in optimizing their computational power. While current machine learning algorithms focus primarily on feed-forward architectures, the neural networks of the cerebral cortex are highly recurrent and can be active even without external input, constituting dynamical systems that fall into fixed-point, periodic, or chaotic attractors. The computational power of an RNN depends critically on the type of dynamical attractor the system is in, with the 'Edge of Chaos' appearing as the optimal region in phase space. Since the connection structure of an RNN determines its functional properties, the dynamical regime can be adjusted by tailoring the statistics of the neural connections (see the first sketch below).

However, even if a proper regime is guaranteed for the free-running RNN without input, temporally varying input signals may drive the system out of its computational optimum. To address this issue, a closed-loop feedback circuit, similar to the automatic gain control (AGC) in electronic amplifiers, can restore the computational optimum quickly and reliably by sending noise or adaptive control signals into the RNN (see the second sketch below).

Recent theoretical and experimental results suggest that the brain may use such closed-loop feedback circuits to optimize information processing. For example, the auditory system may feed an adaptive level of noise into the second synapse of the auditory pathway, exploiting the phenomenon of Stochastic Resonance (SR), to enable the transmission of very weak signals to higher processing stages of the brain. Similarly, adding an optimal level of noise to a free-running RNN can enhance the recurrent flux of information within the network, a novel effect called 'Recurrence Resonance' (RR) that could be used by the brain to regulate the 'decay time' of information in a network and might even be the basis of short-term memory (see the third sketch below).

In reservoir computing, which is based on randomly connected RNNs, computational power depends on four critical conditions: dimensionality, non-linearity, memory, and separation. Noise is a key factor in the adaptive regulation of working conditions that is ubiquitous in biological systems, and its role in optimizing the computational power of RNNs is an important area of investigation. The potential of RNNs and feedback mechanisms for enhancing machine learning algorithms is a promising avenue for future research.
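
To make the regime control concrete, here is a minimal numerical sketch (an illustrative toy, not the project's actual model): it assumes a discrete-time rate network x_{t+1} = tanh(W x_t) and rescales a random weight matrix W to a target spectral radius rho. Sub-critical scaling (rho < 1) drives the activity into a fixed point, super-critical scaling yields chaos, visible as the explosive growth of a tiny perturbation, and rho near 1 places the network close to the 'Edge of Chaos'.

    import numpy as np

    rng = np.random.default_rng(0)
    N = 200  # network size

    def run(W, x0, T=200):
        # iterate the autonomous rate network x_{t+1} = tanh(W @ x_t)
        x = x0.copy()
        traj = np.empty((T, N))
        for t in range(T):
            x = np.tanh(W @ x)
            traj[t] = x
        return traj

    for rho in (0.5, 1.0, 2.0):  # target spectral radius
        W = rng.normal(0.0, 1.0, (N, N)) / np.sqrt(N)
        W *= rho / np.max(np.abs(np.linalg.eigvals(W)))  # tailor the connection statistics
        x0 = rng.normal(0.0, 0.5, N)
        a = run(W, x0)
        b = run(W, x0 + 1e-8 * rng.normal(size=N))  # nearly identical initial state
        growth = np.linalg.norm(a[-1] - b[-1]) / np.linalg.norm(a[0] - b[0])
        print(f"rho={rho}: mean |activity| {np.abs(a[-1]).mean():.3f}, "
              f"perturbation growth {growth:.2e}")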
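
The AGC analogy can be sketched with an equally simple closed loop: monitor the network's working point and feed a corrective signal back in. In the toy example below (again an assumption-laden sketch, not the project's circuit), an integral controller adapts a global gain so that a strong time-varying input no longer saturates the units; in the project, noise or other adaptive control signals play this corrective role.

    import numpy as np

    rng = np.random.default_rng(1)
    N = 200
    W = rng.normal(0.0, 1.0, (N, N)) / np.sqrt(N)
    W *= 1.0 / np.max(np.abs(np.linalg.eigvals(W)))  # free-running network near rho = 1

    target = 0.3  # desired working point (mean absolute activity)
    gain = 1.0    # feedback-controlled gain
    eta = 0.05    # controller rate
    x = rng.normal(0.0, 0.1, N)

    for t in range(500):
        u = 2.0 * np.sin(0.1 * t)          # strong external drive that would
        x = np.tanh(gain * (W @ x + u))    # otherwise saturate the units
        level = np.abs(x).mean()           # measured working point
        gain += eta * (target - level)     # integral control, as in an AGC
        if t % 100 == 99:
            print(f"t={t}: level={level:.3f}, gain={gain:.3f}")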
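
Finally, the RR effect can be probed in the same toy setting. The network below is given strong biases so that, without noise, it settles into a saturated fixed point where hardly any information flows; sweeping the noise level sigma and estimating the mutual information between successive states of a single unit then typically reveals an intermediate optimum, with too little noise leaving the network frozen and too much noise washing the structure out. The histogram estimator is a deliberately crude illustrative choice, not the measure used in the project.

    import numpy as np

    def mutual_info(a, b, bins=16):
        # crude histogram estimate of the mutual information I(a; b) in bits
        p, _, _ = np.histogram2d(a, b, bins=bins)
        p /= p.sum()
        px = p.sum(axis=1, keepdims=True)
        py = p.sum(axis=0, keepdims=True)
        nz = p > 0
        return float((p[nz] * np.log2(p[nz] / (px @ py)[nz])).sum())

    rng = np.random.default_rng(2)
    N = 20
    W = rng.normal(0.0, 1.0, (N, N)) * 1.2 / np.sqrt(N)  # recurrent weights
    bias = 1.5 * rng.normal(size=N)  # strong biases: the noise-free network
                                     # freezes in a saturated fixed point

    for sigma in (0.01, 0.1, 0.3, 1.0, 3.0):  # noise levels to sweep
        x = np.zeros(N)
        trace = np.empty(4000)
        for t in range(4000):
            x = np.tanh(W @ x + bias + sigma * rng.normal(size=N))
            trace[t] = x[0]        # record a single unit
        trace = trace[500:]        # discard the transient
        mi = mutual_info(trace[:-1], trace[1:])
        print(f"sigma={sigma}: I(x_t; x_t+1) ~ {mi:.3f} bits")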
DFG Programme Research Grants
 
 
