Project Details

Neuromorphic Memristive VLSI Architectures for Cognition (NMVAC)

Applicant: Professor Dr. Martin Ziegler, since 4/2020
Subject Area: Electronic Semiconductors, Components and Circuits, Integrated Systems, Sensor Technology, Theoretical Electrical Engineering
Term: 2020 to 2024
Project identifier: Deutsche Forschungsgemeinschaft (DFG), Project number 432009531
Final Report Year: 2025

Final Report Abstract

Neuromorphic computing aims to technically emulate the efficiency and adaptability of biological information processing systems by mimicking the computational principles of cognitive functions in customized hardware. This makes it possible to overcome existing limitations of machine learning systems. The goal is to realize cognitive systems that have generalizable capabilities and require minimal computing resources. Technically, the approach of this project is based on the development of analogue (sub-threshold) VLSI circuits in combination with memristive devices. In recent years, it has been shown that the non-volatile memory function of memristive devices enables the parallelization of matrix-vector multiplications in hardware, yielding enormous savings in the energy consumption and computing times of AI systems. However, the variability of memristive devices, rooted in their inherent stochasticity, limits reliability and accuracy and makes it difficult to develop more complex neuromorphic systems for real-world applications. The aim of this project was therefore to exploit the inherent stochasticity of memristive devices for the development of learning processes, in order to realize robust, reliable and scalable neuromorphic systems. This project has addressed these challenges through three approaches:

1. Development of suitable learning models that take the inherent stochasticity of memristive devices into account and enable the implementation of cognitive learning functions within RRAM structures.
2. Development of spiking neural networks that allow memristive devices with different characteristics to be integrated within a CMOS circuit.
3. Combination of vector-symbolic architectures (VSAs) with attractor networks. VSAs distribute information evenly across all neurons of the network, while attractor networks provide emergent stability and an autoassociative memory function.
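The interplay described above, a memristive crossbar computing a matrix-vector product in one parallel read while device variability perturbs the result, can be illustrated with a minimal numerical sketch. The log-normal variability model and all parameter values here are assumptions for illustration, not the project's actual devices or circuits:

```python
import numpy as np

rng = np.random.default_rng(0)

# Ideal conductance matrix (normalized weights) and input voltage vector.
G_ideal = rng.uniform(0.0, 1.0, size=(4, 8))   # 4 output rows, 8 input columns
v_in = rng.uniform(0.0, 1.0, size=8)

# Device-to-device variability: each programmed conductance deviates from
# its target by multiplicative log-normal noise (assumed variability model).
sigma = 0.15
G_real = G_ideal * rng.lognormal(mean=0.0, sigma=sigma, size=G_ideal.shape)

# A single crossbar read performs the full matrix-vector product in parallel:
# output currents i = G @ v (Ohm's law per device, Kirchhoff's law per column).
i_ideal = G_ideal @ v_in
i_real = G_real @ v_in

# Relative error of the analogue computation caused by device variability.
rel_error = np.linalg.norm(i_real - i_ideal) / np.linalg.norm(i_ideal)
print(f"relative MVM error from device variability: {rel_error:.3f}")
```

Because the multiply-accumulate happens in the analogue domain in one step, energy and latency scale favourably with matrix size; the error term shows why unmitigated stochasticity caps the achievable accuracy.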
The approach developed here is particularly attractive for neuromorphic hardware, which offers scale and parallelism but cannot always guarantee the reliability of individual devices.
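A minimal sketch can show why such distributed representations tolerate unreliable devices: with high-dimensional bipolar hypervectors (an assumed VSA encoding, with a simple similarity-based cleanup standing in for the attractor network), three symbols are bundled into one vector, 20% of its components are corrupted, and each symbol is still recovered by similarity. The dimensionality and corruption rate are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)
D = 2048  # hypervector dimensionality

# Random bipolar hypervectors act as atomic symbols (vector-symbolic coding).
codebook = {name: rng.choice([-1, 1], size=D) for name in ("A", "B", "C")}

# Bundling (element-wise majority) superimposes the symbols into one vector;
# the information is spread evenly across all D components.
bundle = np.sign(codebook["A"] + codebook["B"] + codebook["C"])

# Corrupt 20% of the components to mimic unreliable devices.
noisy = bundle.copy()
flip = rng.choice(D, size=D // 5, replace=False)
noisy[flip] *= -1

def similarity(x, y):
    """Normalized dot product of two bipolar hypervectors (range [-1, 1])."""
    return float(x @ y) / D

# Cleanup: the corrupted bundle remains clearly similar to every stored
# symbol, so a nearest-codebook lookup still recovers them.
sims = {name: similarity(noisy, hv) for name, hv in codebook.items()}
print(sims)
```

No single component carries a symbol on its own, so flipping a random fifth of them only shrinks the similarity margin instead of destroying the stored content; an attractor network would additionally pull the noisy vector back to the clean stored pattern.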

Publications

