Particle-based approximate Bayesian inference approaches such as Stein Variational Gradient Descent (SVGD) combine the flexibility and convergence guarantees of sampling methods with the computational benefits of variational inference. In practice, SVGD relies on the choice of an appropriate kernel function, which impacts its ability to approximate the target distribution -- a challenging problem with only heuristic solutions. We propose Neural Variational Gradient Descent (NVGD), which parameterizes the witness function of the Stein discrepancy with a deep neural network whose parameters are learned in parallel with the inference, eliminating the need for any kernel choice. We empirically evaluate our method on popular synthetic inference problems, real-world Bayesian linear regression, and Bayesian neural network inference.
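To make the mechanism concrete, the sketch below illustrates the core idea on a toy standard-normal target: an inner loop fits a small MLP witness function by ascending a regularized Monte Carlo estimate of the Stein discrepancy over the current particles, and an outer step then moves the particles along the learned witness direction. This is a minimal illustration under assumed choices -- the network architecture, the L2 regularizer `lam`, step sizes, and inner-loop length are all illustrative, not the paper's implementation.

```python
# Minimal NVGD-style sketch in JAX (illustrative hyperparameters throughout).
import jax
import jax.numpy as jnp

def logp(x):
    # Log-density of the target, up to a constant (here: standard normal).
    return -0.5 * jnp.sum(x ** 2)

def witness(params, x):
    # Small MLP f_theta: R^d -> R^d acting as the Stein witness function.
    w1, b1, w2, b2 = params
    h = jnp.tanh(x @ w1 + b1)
    return h @ w2 + b2

def stein_discrepancy(params, particles, lam=1.0):
    # Estimate S(q, p) = E_q[ grad log p(x)^T f(x) + tr(J_f(x)) ] over the
    # particles, minus an L2 penalty on f that keeps the maximizer bounded.
    def integrand(x):
        f = witness(params, x)
        score = jax.grad(logp)(x)
        div = jnp.trace(jax.jacfwd(lambda y: witness(params, y))(x))
        return score @ f + div - lam * jnp.sum(f ** 2)
    return jnp.mean(jax.vmap(integrand)(particles))

def nvgd_step(params, particles, inner_steps=10, lr_net=1e-2, lr_particles=1e-2):
    # Inner loop: fit the witness by gradient ascent on the Stein discrepancy.
    grad_fn = jax.grad(stein_discrepancy)
    for _ in range(inner_steps):
        grads = grad_fn(params, particles)
        params = [p + lr_net * g for p, g in zip(params, grads)]
    # Outer step: transport the particles along the learned witness direction.
    particles = particles + lr_particles * jax.vmap(
        lambda x: witness(params, x))(particles)
    return params, particles

key = jax.random.PRNGKey(0)
d, n, hidden = 2, 100, 32
k1, k2, k3 = jax.random.split(key, 3)
params = [0.1 * jax.random.normal(k1, (d, hidden)), jnp.zeros(hidden),
          0.1 * jax.random.normal(k2, (hidden, d)), jnp.zeros(d)]
particles = jax.random.normal(k3, (n, d)) + 3.0  # initialize away from target

for _ in range(200):
    params, particles = nvgd_step(params, particles)
print("particle mean:", particles.mean(0))  # should drift toward 0
```

Note that, unlike SVGD, no kernel appears anywhere: the witness function that SVGD would obtain in closed form from an RKHS is instead represented by the network and re-fitted alongside the particle updates.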