Diffusion Models (DMs), also referred to as score-based diffusion models, use neural networks to parameterize score functions. Unlike most other probabilistic models, DMs model the score function directly, which makes them flexible to parameterize and potentially highly expressive for probabilistic modeling. DMs learn fine-grained knowledge of the underlying distribution, namely its marginal score functions; a crucial research direction is therefore how to distill this knowledge and fully exploit the models' potential. Our objective is to provide an accessible overview of modern approaches for distilling DMs, starting with an introduction to DMs and a discussion of the challenges involved in distilling them into neural vector fields. We also survey existing work on distilling DMs into both stochastic and deterministic implicit generators. Finally, we review accelerated diffusion-sampling algorithms as a training-free method of distillation. This tutorial is intended for readers with a basic understanding of generative models who wish to apply DM distillation or embark on a research project in this field.
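To make the central object concrete: the score function is the gradient of the log-density, s(x) = ∇ₓ log p(x), which DMs approximate with a neural network at every noise level. The sketch below illustrates the definition on a 1-D Gaussian, where the score has the closed form (μ − x)/σ², and verifies it against a finite-difference gradient of the log-density; the specific values of μ and σ are illustrative assumptions, not taken from the text.

```python
import numpy as np

def log_density(x, mu=0.0, sigma=1.0):
    """Log-density of a 1-D Gaussian N(mu, sigma^2)."""
    return -0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma * np.sqrt(2.0 * np.pi))

def gaussian_score(x, mu=0.0, sigma=1.0):
    """Score function s(x) = d/dx log p(x) = (mu - x) / sigma^2."""
    return (mu - x) / sigma ** 2

# Numerical check: a central finite difference of log p(x)
# should match the closed-form score at any point x.
x, eps = 1.5, 1e-5
fd = (log_density(x + eps) - log_density(x - eps)) / (2.0 * eps)
print(abs(fd - gaussian_score(x)) < 1e-6)  # the two gradients agree
```

A diffusion model replaces the closed-form expression with a learned network sθ(x, t) that approximates this gradient for the noised marginals at each time t; the distillation methods surveyed here compress that family of score estimates into faster samplers or implicit generators.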