We study the problem of estimating the score function using both implicit score matching and denoising score matching. Assuming that the data distribution exhibits a low-dimensional structure, we prove that implicit score matching not only adapts to the intrinsic dimension but also achieves the same rates of convergence as denoising score matching in terms of the sample size. Furthermore, we demonstrate that both methods allow us to estimate log-density Hessians, without the curse of dimensionality, by simple differentiation. This justifies the convergence of ODE-based samplers for generative diffusion models. Our approach is based on Gagliardo-Nirenberg-type inequalities relating weighted $L^2$-norms of smooth functions to those of their derivatives.
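For context, a minimal sketch of the two objectives compared above, in their standard forms (Hyvärinen, 2005; Vincent, 2011); the paper's exact definitions may differ. For a model score $s_\theta$ and Gaussian smoothing with variance $\sigma^2$,
\[
J_{\mathrm{ISM}}(\theta) = \mathbb{E}_{x\sim p}\Big[\tfrac12\,\|s_\theta(x)\|^2 + \nabla\!\cdot s_\theta(x)\Big],
\qquad
J_{\mathrm{DSM}}(\theta) = \mathbb{E}_{x\sim p,\ \tilde x\sim \mathcal{N}(x,\sigma^2 I)}\Big[\tfrac12\,\big\|s_\theta(\tilde x) - \tfrac{x-\tilde x}{\sigma^2}\big\|^2\Big].
\]
Up to additive constants independent of $\theta$, these coincide with the explicit score-matching loss $\tfrac12\,\mathbb{E}\,\|s_\theta - \nabla\log p\|^2$ taken against the data distribution and its Gaussian-smoothed version, respectively.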