Denoising diffusion models have spurred significant gains in density modeling and image generation, precipitating an industrial revolution in text-guided AI art generation. We introduce a new mathematical foundation for diffusion models inspired by classic results in information theory that connect Information with Minimum Mean Square Error regression, the so-called I-MMSE relations. We generalize the I-MMSE relations to exactly relate the data distribution to an optimal denoising regression problem, leading to an elegant refinement of existing diffusion bounds. This new insight leads to several improvements for probability distribution estimation, including theoretical justification for diffusion model ensembling. Remarkably, our framework shows how continuous and discrete probabilities can be learned with the same regression objective, avoiding domain-specific generative models used in variational methods. Code to reproduce experiments is provided at http://github.com/kxh001/ITdiffusion and simplified demonstration code is at http://github.com/gregversteeg/InfoDiffusionSimple.
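For context, the "I-MMSE relations" invoked above refer to the classic result of Guo, Shamai, and Verdú relating mutual information to minimum mean square error in Gaussian channels. A standard statement (in one common signal-to-noise-ratio parameterization; the paper's exact form may differ) is the following sketch:

\[
\frac{d}{d\,\mathrm{snr}}\, I\!\left(\mathbf{x};\, \sqrt{\mathrm{snr}}\,\mathbf{x} + \boldsymbol{\epsilon}\right)
= \tfrac{1}{2}\,\mathrm{mmse}(\mathrm{snr}),
\qquad
\mathrm{mmse}(\mathrm{snr}) \equiv \mathbb{E}\,\big\| \mathbf{x} - \mathbb{E}[\mathbf{x} \mid \mathbf{y}_{\mathrm{snr}}] \big\|^2,
\]

where \(\mathbf{y}_{\mathrm{snr}} = \sqrt{\mathrm{snr}}\,\mathbf{x} + \boldsymbol{\epsilon}\) with \(\boldsymbol{\epsilon} \sim \mathcal{N}(0, I)\) independent of \(\mathbf{x}\). The abstract's "optimal denoising regression problem" corresponds to estimating \(\mathbb{E}[\mathbf{x} \mid \mathbf{y}_{\mathrm{snr}}]\), i.e., the conditional-mean denoiser whose error appears on the right-hand side.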