Project Details

Accelerating Diffusion Models Through Sparse Neural Networks

Subject Area: Mathematics
Term: since 2024
Project identifier: Deutsche Forschungsgemeinschaft (DFG) - Project number 543964668
 
Mathematical convergence guarantees for diffusion models have recently become an active direction of research. Studies have established non-asymptotic bounds on the total variation distance between generated and original samples. These bounds scale as O(d/ε), that is, they depend polynomially on the data dimension d and on the inverse of the error level ε. Our goal is to accelerate these convergence rates with respect to d: we aim for rates essentially of order O(s/ε), where the effective dimension s is much smaller than d. For example, s can be the sparsity level of the estimator.

To achieve this, we develop two distinct approaches. First, we consider target distributions that lie on a low-dimensional manifold embedded in a high-dimensional ambient space. In this setup, we use the estimated score function, in particular the information about the low-dimensional manifold encoded in it, to expedite the sampling process. Second, we consider target scores that can be approximated by a sparse neural network. In this setup, we employ regularization techniques from high-dimensional statistics to expedite the sampling processes of diffusion models. After establishing these mathematical results, we connect them to the computational challenges of minimizing the score-matching objectives.
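To make the second approach concrete, one common way to write a sparsity-regularized score-matching problem is sketched below. The time weighting λ(t) and the ℓ1 penalty μ‖θ‖₁ are illustrative assumptions drawn from standard practice in high-dimensional statistics, not details taken from the project description:

```latex
% Score matching with an l1 sparsity penalty (illustrative form).
% s_theta is the neural score model, p_t the law of the forward
% process at time t, mu >= 0 the regularization strength, and
% lambda(t) a positive weighting function.
\min_{\theta}\;
\int_0^T \lambda(t)\,
\mathbb{E}_{x_t \sim p_t}
\bigl\| s_\theta(x_t, t) - \nabla_{x_t} \log p_t(x_t) \bigr\|_2^2 \, dt
\;+\; \mu \,\|\theta\|_1
```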
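As a computational companion, the following PyTorch sketch shows how such a regularized objective could be minimized in practice via denoising score matching. Everything here (the MLP architecture, the Ornstein-Uhlenbeck forward process, the penalty weight l1_weight) is an illustrative assumption rather than the project's actual method:

```python
# Minimal sketch (assumptions noted above): training a score network with
# an l1 penalty on its weights to encourage sparsity, one way to instantiate
# "target scores approximated by a sparse neural network".
import torch
import torch.nn as nn

class ScoreNet(nn.Module):
    """Small MLP score model s_theta(x, t); architecture is illustrative."""
    def __init__(self, dim: int, hidden: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim + 1, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, dim),
        )

    def forward(self, x: torch.Tensor, t: torch.Tensor) -> torch.Tensor:
        return self.net(torch.cat([x, t[:, None]], dim=1))

def dsm_loss(model: nn.Module, x0: torch.Tensor, l1_weight: float = 1e-4):
    """Denoising score matching for the OU forward process
    x_t = e^{-t} x_0 + sqrt(1 - e^{-2t}) z, z ~ N(0, I),
    whose conditional score is -z / sqrt(1 - e^{-2t})."""
    t = torch.rand(x0.shape[0]) * 0.99 + 0.01          # avoid t = 0
    mean = torch.exp(-t)[:, None] * x0
    std = torch.sqrt(1.0 - torch.exp(-2.0 * t))[:, None]
    z = torch.randn_like(x0)
    xt = mean + std * z
    target = -z / std
    fit = ((model(xt, t) - target) ** 2).sum(dim=1).mean()
    sparsity = sum(p.abs().sum() for p in model.parameters())  # l1 penalty
    return fit + l1_weight * sparsity

if __name__ == "__main__":
    torch.manual_seed(0)
    model = ScoreNet(dim=8)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    x0 = torch.randn(256, 8)                           # stand-in data
    for _ in range(100):
        opt.zero_grad()
        loss = dsm_loss(model, x0)
        loss.backward()
        opt.step()
```

The ℓ1 term drives many parameters toward zero, which is the sense in which the sparsity level s, rather than the ambient dimension d, would govern the rates the project targets.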
DFG Programme: Priority Programmes
 
 
