We introduce Autoregressive Diffusion Models (ARDMs), a model class encompassing and generalizing order-agnostic autoregressive models (Uria et al., 2014) and absorbing discrete diffusion (Austin et al., 2021), which we show to be special cases of ARDMs under mild assumptions. ARDMs are simple to implement and easy to train. Unlike standard ARMs, they do not require causal masking of model representations, and they can be trained using an efficient objective, similar to modern probabilistic diffusion models, that scales favourably to high-dimensional data. At test time, ARDMs support parallel generation, which can be adapted to fit any given generation budget. We find that ARDMs require significantly fewer steps than discrete diffusion models to attain the same performance. Finally, we apply ARDMs to lossless compression and show that they are uniquely suited to this task. Contrary to existing approaches based on bits-back coding, ARDMs obtain compelling results not only on complete datasets but also on compressing single data points. Moreover, this can be done using a modest number of network calls for (de)compression, owing to the model's adaptable parallel generation.
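As a concrete illustration of the training objective described above, the following is a minimal sketch (not the authors' reference implementation) of a one-sample estimate of the order-agnostic likelihood bound: a step t and a generation order are sampled, the "future" positions are replaced by an absorbing mask token, and a single non-causal forward pass scores all masked positions. The `model` interface and the `mask_token` id are assumptions for illustration.

```python
import torch
import torch.nn.functional as F

def oa_ardm_loss(model, x, mask_token):
    """One-sample estimate of the order-agnostic ARDM objective (a sketch).

    model: assumed to map masked inputs of shape (B, D) to logits of
           shape (B, D, vocab); no causal masking is required.
    x: LongTensor of token ids, shape (B, D).
    mask_token: id of the absorbing/mask token (an assumed convention).
    """
    B, D = x.shape
    # Sample a step t ~ Uniform(1, ..., D) per example.
    t = torch.randint(1, D + 1, (B, 1), device=x.device)
    # Sample a random generation order: each row is a permutation of 0..D-1.
    ranks = torch.argsort(torch.rand(B, D, device=x.device), dim=-1)
    # At step t, the t-1 earliest positions in the order are observed;
    # the remaining D - t + 1 positions are masked and must be predicted.
    observed = ranks < (t - 1)
    x_in = torch.where(observed, x, torch.full_like(x, mask_token))
    logits = model(x_in)  # (B, D, vocab), single non-causal forward pass
    logp = -F.cross_entropy(logits.transpose(1, 2), x, reduction="none")
    masked = ~observed
    # D times the mean log-likelihood over masked positions is an unbiased
    # one-sample estimate of the likelihood bound term for this t.
    nll = -(D * (logp * masked).sum(-1) / masked.sum(-1)).mean()
    return nll
```

Because the network conditions on a masked input rather than a causal stream, any standard architecture can be used, and one forward pass scores all D - t + 1 masked positions at once; this is what makes the objective efficient and enables the parallel generation the abstract refers to.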