We provide the first finite-particle convergence rate for Stein variational gradient descent (SVGD). Specifically, whenever the target distribution is sub-Gaussian with a Lipschitz score, SVGD with n particles and an appropriate step size sequence drives the kernel Stein discrepancy to zero at an order 1/sqrt(log log n) rate. We suspect that the dependence on n can be improved, and we hope that our explicit, non-asymptotic proof strategy will serve as a template for future refinements.
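For concreteness, the SVGD particle update the abstract refers to can be sketched in a few lines of NumPy. This is an illustrative implementation, not the paper's: the RBF kernel, fixed bandwidth, constant step size, and standard Gaussian target are all assumptions made here for the example.

```python
import numpy as np

def rbf_kernel_and_grad(X, h=1.0):
    """RBF kernel matrix K[j, i] = k(x_j, x_i) and its gradient
    in the first argument, gradK[j, i] = grad_{x_j} k(x_j, x_i)."""
    diffs = X[:, None, :] - X[None, :, :]      # (n, n, d): x_j - x_i
    sq = np.sum(diffs ** 2, axis=-1)           # (n, n)
    K = np.exp(-sq / (2.0 * h))                # (n, n)
    gradK = -diffs * K[:, :, None] / h         # (n, n, d)
    return K, gradK

def svgd_step(X, score, eps=0.1, h=1.0):
    """One SVGD update: x_i <- x_i + eps * phi(x_i), where
    phi(x_i) = (1/n) sum_j [k(x_j, x_i) score(x_j) + grad_{x_j} k(x_j, x_i)]."""
    n = X.shape[0]
    K, gradK = rbf_kernel_and_grad(X, h)
    S = score(X)                               # (n, d) score evaluations
    phi = (K.T @ S + gradK.sum(axis=0)) / n    # attraction + repulsion terms
    return X + eps * phi

if __name__ == "__main__":
    # Illustrative target: standard Gaussian, so score(x) = -x.
    rng = np.random.default_rng(0)
    X = rng.normal(loc=3.0, scale=0.2, size=(50, 1))  # particles start far off
    for _ in range(1000):
        X = svgd_step(X, score=lambda x: -x, eps=0.2)
    print(float(X.mean()), float(X.var()))
```

The first term in `phi` pulls particles toward high-density regions of the target; the kernel-gradient term repels nearby particles, which is what keeps the n-particle approximation spread out rather than collapsing to the mode.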