Diffusion models (DMs) have made significant progress in image, audio, and video generation. One downside of DMs is their slow iterative sampling process. Recent fast-sampling algorithms are designed from the perspective of differential equations. However, in higher-order algorithms based on Taylor expansion, estimating the derivative of the score function becomes intractable due to the complexity of large-scale, well-trained neural networks. Motivated by this, in this work we introduce the recursive difference (RD) method to calculate the derivative of the score function in the realm of DMs. Based on the RD method and the truncated Taylor expansion of the score-integrand, we propose SciRE-Solver, with a convergence-order guarantee, for accelerating the sampling of DMs. To further investigate the effectiveness of the RD method, we also propose a variant named SciREI-Solver, based on the RD method and an exponential integrator. Our proposed sampling algorithms with the RD method attain state-of-the-art (SOTA) FIDs in comparison to existing training-free sampling algorithms, across both discrete-time and continuous-time pre-trained DMs and across various numbers of score function evaluations (NFEs). Remarkably, SciRE-Solver with a small number of NFEs demonstrates promising potential to surpass the FIDs achieved by some pre-trained models in their original papers using no fewer than $1000$ NFEs. For example, we reach SOTA FIDs of $2.40$ with $100$ NFEs for a continuous-time DM and of $3.15$ with $84$ NFEs for a discrete-time DM on CIFAR-10, as well as of $2.17$ ($2.02$) with $18$ ($50$) NFEs for a discrete-time DM on CelebA 64$\times$64.
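The abstract does not spell out the RD update itself, but the underlying idea of a truncated-Taylor sampling step with a difference-based derivative estimate can be illustrated. The sketch below is our own minimal stand-in, not the paper's algorithm: `score` is a toy closed-form function replacing the trained network, and `rd_derivative` uses a single forward difference along an Euler-predicted point rather than the paper's recursive scheme; the names `rd_derivative` and `taylor_step` are hypothetical.

```python
import math

def score(t, x):
    # Toy stand-in for the score-integrand; a real DM would evaluate a
    # trained neural network here. Chosen so dx/dt = score(t, x) has the
    # closed-form solution x(t) = x(t0) * exp(-(atan(t) - atan(t0))).
    return -x / (1.0 + t * t)

def rd_derivative(f, t, x, h):
    # Difference-based estimate of the total derivative d/dt f(t, x(t))
    # along the trajectory: evaluate f again at an Euler-predicted point.
    # (A simplification of the paper's recursive difference method.)
    f0 = f(t, x)
    f1 = f(t + h, x + h * f0)
    return (f1 - f0) / h

def taylor_step(x, t, dt):
    # Second-order truncated Taylor step for dx/dt = score(t, x):
    # x(t + dt) ≈ x + dt * f + (dt^2 / 2) * df/dt.
    f = score(t, x)
    df = rd_derivative(score, t, x, dt)
    return x + dt * f + 0.5 * dt * dt * df
```

On the toy problem above, one such step starting from $x(0.5)=1$ with $dt=0.1$ lands far closer to the exact solution than a plain Euler step, which is the kind of per-step accuracy gain that lets higher-order solvers cut the NFE budget.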