Stein variational gradient descent (SVGD) is a deterministic particle-based inference algorithm that provides an efficient alternative to Markov chain Monte Carlo. However, SVGD has been found to suffer from variance underestimation when the dimensionality of the target distribution is high. Recent developments have advocated projecting both the score function and the data onto real lines to sidestep this issue, although this can severely overestimate the epistemic (model) uncertainty. In this work, we propose Grassmann Stein variational gradient descent (GSVGD) as an alternative approach, which permits projections onto subspaces of arbitrary dimension. Compared with other variants of SVGD that rely on dimensionality reduction, GSVGD updates the projectors for the score function and the data simultaneously, and the optimal projectors are determined through a coupled Grassmann-valued diffusion process which explores favourable subspaces. Both our theoretical and experimental results suggest that GSVGD enjoys efficient state-space exploration in high-dimensional problems that have an intrinsic low-dimensional structure.
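For readers unfamiliar with the base algorithm, a minimal sketch of the standard SVGD particle update follows. This is not the GSVGD method of this paper, only the vanilla update it builds on: each particle moves along a kernel-weighted average of the score plus a repulsive kernel-gradient term. The RBF bandwidth `h` is fixed here for simplicity (practical implementations typically use the median heuristic), and the standard-Gaussian target is an illustrative choice.

```python
import numpy as np

def svgd_step(particles, score, h=1.0, step=0.1):
    """One vanilla SVGD update with an RBF kernel (fixed bandwidth h)."""
    # Pairwise differences: diff[i, j] = x_i - x_j
    diff = particles[:, None, :] - particles[None, :, :]
    sq_dist = np.sum(diff ** 2, axis=-1)
    K = np.exp(-sq_dist / (2 * h))                  # kernel matrix k(x_i, x_j)
    grad_K = -diff / h * K[:, :, None]              # grad of k w.r.t. its first argument
    # phi(x_m) = (1/n) * sum_j [ k(x_j, x_m) * score(x_j) + grad_{x_j} k(x_j, x_m) ]
    # First term drives particles toward high-density regions; second term repels
    # particles from each other, which is what maintains sample diversity.
    phi = (K.T @ score(particles) + grad_K.sum(axis=0)) / len(particles)
    return particles + step * phi

# Illustrative target: standard Gaussian in 2D, so score(x) = -x.
rng = np.random.default_rng(0)
x = rng.normal(loc=3.0, size=(50, 2))               # particles start far from the mode
for _ in range(200):
    x = svgd_step(x, lambda p: -p)
# The particle mean drifts from ~(3, 3) toward the target mean (0, 0).
```

The variance-underestimation issue mentioned above arises because in high dimensions the repulsive term `grad_K` becomes negligible relative to the driving term, collapsing the particles; GSVGD addresses this by performing the update within learned low-dimensional subspaces.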