Denoising diffusion probabilistic models (DDPMs) are state-of-the-art methods for generating synthetic data from high-dimensional data distributions.
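For readers unfamiliar with DDPMs, the following is a minimal sketch of the forward noising and reverse (ancestral) sampling dynamics they are built on. The step count, the linear noise schedule, and the use of the true noise in place of a trained predictor are illustrative assumptions, not the construction analyzed in this work.

```python
import numpy as np

rng = np.random.default_rng(0)

T = 1000                               # number of diffusion steps (assumed)
betas = np.linspace(1e-4, 2e-2, T)     # linear noise schedule (assumed)
alphas = 1.0 - betas
alpha_bars = np.cumprod(alphas)

def forward_noising(x0, t):
    """Sample x_t ~ N(sqrt(alpha_bar_t) * x0, (1 - alpha_bar_t) * I)."""
    eps = rng.standard_normal(x0.shape)
    x_t = np.sqrt(alpha_bars[t]) * x0 + np.sqrt(1.0 - alpha_bars[t]) * eps
    return x_t, eps

def reverse_step(x_t, t, eps_hat):
    """One ancestral sampling step, given a noise prediction eps_hat
    (in practice the output of a trained score/noise network)."""
    coef = betas[t] / np.sqrt(1.0 - alpha_bars[t])
    mean = (x_t - coef * eps_hat) / np.sqrt(alphas[t])
    if t == 0:
        return mean
    return mean + np.sqrt(betas[t]) * rng.standard_normal(x_t.shape)

# Toy round trip: noise a point, then take one reverse step using the
# true noise as a stand-in for the learned predictor.
x0 = rng.standard_normal(2)
x_t, eps = forward_noising(x0, 500)
x_prev = reverse_step(x_t, 500, eps)
```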
This work studies DDPMs under the manifold hypothesis and proves that, for score learning, they achieve rates independent of the ambient dimension.
For sampling complexity, the rates achieved by DDPMs are likewise independent of the ambient dimension $D$ with respect to the Kullback-Leibler divergence, and of order $O(\sqrt{D})$ with respect to the Wasserstein distance.
A new framework connecting diffusion models to the theory of extrema of Gaussian processes is developed.
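As background on the kind of result that theory provides (quoted here only for illustration; the abstract does not specify which tools the framework uses), a classical example is the Borell-TIS concentration inequality: for a centered Gaussian process $(X_t)_{t \in T}$ with $\sigma^2 = \sup_{t \in T} \mathbb{E}[X_t^2]$,
$$\mathbb{P}\Big(\sup_{t \in T} X_t \ge \mathbb{E}\big[\sup_{t \in T} X_t\big] + u\Big) \le \exp\!\left(-\frac{u^2}{2\sigma^2}\right), \qquad u > 0.$$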