

Virtual presentation / poster accept

Information-Theoretic Diffusion

Xianghao Kong · Rob Brekelmans · Greg Ver Steeg

Keywords: [ Unsupervised and Self-supervised learning ] [ information theory ] [ density models ] [ diffusion ]


Abstract:

Denoising diffusion models have spurred significant gains in density modeling and image generation, precipitating an industrial revolution in text-guided AI art generation. We introduce a new mathematical foundation for diffusion models inspired by classic results in information theory that connect Information with Minimum Mean Square Error regression, the so-called I-MMSE relations. We generalize the I-MMSE relations to exactly relate the data distribution to an optimal denoising regression problem, leading to an elegant refinement of existing diffusion bounds. This new insight leads to several improvements for probability distribution estimation, including a theoretical justification for diffusion model ensembling. Remarkably, our framework shows how continuous and discrete probabilities can be learned with the same regression objective, avoiding domain-specific generative models used in variational methods.
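
For context, the classic I-MMSE identity the abstract builds on (due to Guo, Shamai, and Verdú) can be stated as follows; the notation below (SNR parameter \gamma, standard Gaussian noise N) is illustrative and not taken from the paper itself.

% Classic I-MMSE relation: the derivative of the mutual information
% across a Gaussian channel with respect to the SNR equals half the
% minimum mean square error of estimating X from the noisy observation.
\frac{\mathrm{d}}{\mathrm{d}\gamma}\, I\big(X;\ \sqrt{\gamma}\,X + N\big) \;=\; \tfrac{1}{2}\,\mathrm{mmse}(\gamma),
\qquad
\mathrm{mmse}(\gamma) \;=\; \mathbb{E}\Big[\big\lVert X - \mathbb{E}\big[X \mid \sqrt{\gamma}\,X + N\big]\big\rVert^{2}\Big],
\qquad N \sim \mathcal{N}(0, I).

Per the abstract, the paper generalizes this relation so that it holds exactly for the data distribution itself, tying the log-density to the mean square error of an optimal denoising regression across noise levels.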
