Stein variational gradient descent (SVGD) is a general-purpose optimization-based sampling algorithm that has recently exploded in popularity, but it is limited by two issues: it is known to produce biased samples, and it can be slow to converge on complicated distributions. A recently proposed stochastic variant of SVGD (sSVGD) addresses the first issue, producing unbiased samples by injecting a specific noise term into the SVGD dynamics such that asymptotic convergence is guaranteed. Meanwhile, Stein variational Newton (SVN), a Newton-like extension of SVGD, dramatically accelerates the convergence of SVGD by incorporating Hessian information into the dynamics, but also produces biased samples. In this paper we derive, and provide a practical implementation of, a stochastic variant of SVN (sSVN) which is both asymptotically correct and converges rapidly. We demonstrate the effectiveness of our algorithm on a difficult class of test problems -- the Hybrid Rosenbrock density -- and show that sSVN converges using three orders of magnitude fewer gradient evaluations of the log likelihood than its stochastic SVGD counterpart. Our results show that sSVN is a promising approach to accelerating high-precision Bayesian inference tasks of modest dimension, $d\sim\mathcal{O}(10)$.
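For readers unfamiliar with the base algorithm, the SVGD dynamics referenced above transport a set of particles along the kernelized Stein direction, $\phi(x_i) = \frac{1}{n}\sum_j \left[k(x_j, x_i)\nabla_{x_j}\log p(x_j) + \nabla_{x_j} k(x_j, x_i)\right]$, combining an attractive term driven by the score of the target with a repulsive term from the kernel gradient. The sketch below is a minimal NumPy illustration of one such update with an RBF kernel on a toy Gaussian target; the function names, step size, and bandwidth are illustrative choices, not part of the paper's implementation.

```python
import numpy as np

def rbf_kernel(x, h=1.0):
    """RBF kernel matrix and its gradient w.r.t. the first argument."""
    diff = x[:, None, :] - x[None, :, :]        # (n, n, d): diff[j, i] = x_j - x_i
    sq = np.sum(diff**2, axis=-1)               # pairwise squared distances
    K = np.exp(-sq / (2.0 * h**2))              # k(x_j, x_i)
    gradK = -diff * K[:, :, None] / h**2        # grad_{x_j} k(x_j, x_i)
    return K, gradK

def svgd_step(x, grad_logp, eps=0.05, h=1.0):
    """One deterministic SVGD update along the kernelized Stein direction."""
    n = x.shape[0]
    K, gradK = rbf_kernel(x, h)
    # Attractive term (kernel-weighted scores) plus repulsive term (kernel gradients).
    phi = (K @ grad_logp + gradK.sum(axis=0)) / n
    return x + eps * phi

# Toy target: standard normal in 1D, so grad log p(x) = -x.
rng = np.random.default_rng(0)
x = rng.normal(loc=3.0, scale=0.5, size=(50, 1))  # particles start far from the mode
for _ in range(500):
    x = svgd_step(x, -x)
```

The stochastic variants (sSVGD, and the sSVN derived in the paper) modify this deterministic flow with a correcting noise term so that the stationary distribution of the particles is exactly the target, removing the finite-particle bias of the update above.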