Project Details
SPP 2298: Theoretical Foundations of Deep Learning
Subject Area
Mathematics
Computer Science, Systems and Electrical Engineering
Materials Science and Engineering
Medicine
Physics
Term
since 2021
Project identifier
Deutsche Forschungsgemeinschaft (DFG) - Project number 441826958
In parallel with the impressive success of deep learning in real-world applications ranging from autonomous driving to game intelligence and healthcare, deep learning-based methods are now also making a strong impact in science, where they replace or complement state-of-the-art classical model-based methods for solving mathematical problems such as inverse problems or partial differential equations. Despite these outstanding successes, however, most research on deep neural networks is empirically driven, and their theoretical-mathematical foundations are largely lacking.

The main goal of this Priority Programme is to develop a comprehensive theoretical foundation of deep learning. Research within the programme is structured along three complementary viewpoints: (1) the statistical perspective, which views neural network training as a statistical learning problem and investigates expressivity, learning, optimization, and generalization; (2) the application perspective, which focuses on security, robustness, interpretability, and fairness; and (3) the mathematical-methodological perspective, which develops and theoretically analyzes novel deep learning-based approaches to solving inverse problems and partial differential equations.

The research questions addressed in this Priority Programme are to a large extent interdisciplinary and can only be solved by a joint effort of mathematics and computer science. Mathematical methods and concepts from all areas of mathematics are required, including algebraic geometry, analysis, stochastics, approximation theory, differential geometry, discrete mathematics, functional analysis, optimal control, optimization, and topology. Statistics and theoretical computer science also play a fundamental role. In this sense, methods from mathematics, statistics, and computer science form the core of this Priority Programme.
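To make the statistical perspective mentioned above concrete, the following is a minimal, self-contained sketch (not part of the programme's own material) of neural network training viewed as empirical risk minimization: a one-hidden-layer network is fitted to noisy samples of an unknown function by gradient descent on the empirical squared loss, and the gap between empirical and test risk illustrates the generalization question. The architecture, data, and hyperparameters are arbitrary choices made purely for illustration.

```python
# Illustrative sketch only: training as empirical risk minimization.
# All names, sizes, and hyperparameters below are assumptions for the example.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression data: y = f(x) + noise, with f unknown to the learner.
f = lambda x: np.sin(3 * x)
x_train = rng.uniform(-1, 1, size=(200, 1))
y_train = f(x_train) + 0.1 * rng.standard_normal(x_train.shape)

# One-hidden-layer network  x -> W2 @ tanh(W1 @ x + b1) + b2
width = 32
W1 = rng.standard_normal((width, 1)) * 0.5
b1 = np.zeros((width, 1))
W2 = rng.standard_normal((1, width)) * 0.5
b2 = np.zeros((1, 1))

lr = 0.05
for step in range(2000):
    # Forward pass
    h = np.tanh(W1 @ x_train.T + b1)        # hidden activations, (width, n)
    pred = W2 @ h + b2                      # predictions, (1, n)
    resid = pred - y_train.T
    risk = np.mean(resid ** 2)              # empirical risk

    # Backward pass: gradients of the empirical risk
    n = x_train.shape[0]
    g_pred = 2 * resid / n
    g_W2 = g_pred @ h.T
    g_b2 = g_pred.sum(axis=1, keepdims=True)
    g_h = W2.T @ g_pred
    g_z = g_h * (1 - h ** 2)                # derivative of tanh
    g_W1 = g_z @ x_train
    g_b1 = g_z.sum(axis=1, keepdims=True)

    # Gradient descent step
    W1 -= lr * g_W1; b1 -= lr * g_b1
    W2 -= lr * g_W2; b2 -= lr * g_b2

# Generalization: evaluate the learned predictor on fresh samples.
x_test = rng.uniform(-1, 1, size=(200, 1))
test_pred = W2 @ np.tanh(W1 @ x_test.T + b1) + b2
test_risk = np.mean((test_pred - f(x_test).T) ** 2)
print(f"empirical risk: {risk:.4f}, test risk: {test_risk:.4f}")
```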
DFG Programme
Priority Programmes
International Connection
Canada, France, United Kingdom
Projects
- Adaptive Neural Tensor Networks for parametric PDEs (Applicants Eigel, Martin ; Grasedyck, Lars )
- Assessment of Deep Learning through Meanfield Theory (Applicant Herty, Michael )
- Combinatorial and implicit approaches to deep learning (Applicant Montúfar, Guido )
- Coordination Funds (Applicant Kutyniok, Gitta )
- Curse-of-dimensionality-free nonlinear optimal feedback control with deep neural networks. A compositionality-based approach via Hamilton-Jacobi-Bellman PDEs (Applicant Grüne, Lars )
- Deep assignment flows for structured data labeling: design, learning and prediction performance (Applicant Schnörr, Christoph )
- Deep-Learning Based Regularization of Inverse Problems (Applicants Burger, Martin ; Kutyniok, Gitta )
- Deep learning for non-local partial differential equations (Applicants Jentzen, Arnulf ; Kutyniok, Gitta )
- Deep neural networks overcome the curse of dimensionality in the numerical approximation of stochastic control problems and of semilinear Poisson equations (Applicants Hutzenthaler, Martin ; Kruse, Thomas )
- Foundations of Supervised Deep Learning for Inverse Problems (Applicants Burger, Martin ; Möller, Michael )
- Globally Optimal Neural Network Training (Applicants Pfetsch, Marc Emanuel ; Pokutta, Sebastian )
- Implicit Bias and Low Complexity Networks (iLOCO) (Applicants Fornasier, Massimo ; Rauhut, Holger )
- Multi-Phase Probabilistic Optimizers for Deep Learning (Applicant Hennig, Philipp )
- Multilevel Architectures and Algorithms in Deep Learning (Applicants Herzog, Roland ; Schiela, Anton )
- Multiscale Dynamics of Neural Nets via Stochastic Graphops (Applicants Engel, Maximilian ; Kühn, Ph.D., Christian )
- On the Convergence of Variational Deep Learning to Sums of Entropies (Applicants Fischer, Asja ; Lücke, Jörg )
- Provable Robustness Certification of Graph Neural Networks (Applicant Günnemann, Stephan )
- Solving linear inverse problems with end-to-end neural networks: expressivity, generalization, and robustness (Applicants Heckel, Reinhard ; Krahmer, Ph.D., Felix )
- Statistical Foundations of Unsupervised and Semi-supervised Deep Learning (Applicant Ghoshdastidar, Ph.D., Debarghya )
- Structure-preserving deep neural networks to accelerate the solution of the Boltzmann equation (Applicant Frank, Martin )
- The Data-dependency Gap: A New Problem in the Learning Theory of Convolutional Neural Networks (Applicant Kloft, Marius )
- Towards a Statistical Analysis of DNN Training Trajectories (Applicant Steinwart, Ingo )
- Towards everywhere reliable classification - A joint framework for adversarial robustness and out-of-distribution detection (Applicant Hein, Matthias )
- Understanding Invertible Neural Networks for Solving Inverse Problems
Spokesperson
Professor Dr. Gitta Kutyniok