We propose a projected Wasserstein gradient descent method (pWGD) for high-dimensional Bayesian inference problems. In WGD, the underlying density of the particle system is approximated by kernel density estimation (KDE), which suffers from the long-standing curse of dimensionality. We overcome this challenge by exploiting the intrinsic low-rank structure in the difference between the posterior and prior distributions: the parameters are projected into a low-dimensional subspace, which alleviates the approximation error of KDE in high dimensions. We formulate a projected Wasserstein gradient flow and analyze its convergence properties under mild assumptions. Several numerical experiments illustrate the accuracy, convergence, and scalability of pWGD with respect to parameter dimension, sample size, and number of processor cores.
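To make the particle-system view concrete, here is a minimal sketch of (unprojected) Wasserstein gradient descent for a KL objective, with the particle density estimated by Gaussian KDE. This is an illustrative toy, not the paper's pWGD: the target, step size, bandwidth `h`, and helper `wgd_step` are all assumptions for the example, and the low-rank projection step is omitted.

```python
import numpy as np

def wgd_step(X, grad_logpi, h=0.3, eta=0.05):
    """One WGD step on a particle ensemble X of shape (n, d).

    Particles follow the Wasserstein gradient flow of KL(rho || pi),
    i.e. velocity = grad log pi(x) - grad log rho(x), where rho is
    estimated from the particles themselves by Gaussian KDE.
    """
    diffs = X[:, None, :] - X[None, :, :]                  # (n, n, d): x_i - x_j
    K = np.exp(-np.sum(diffs**2, axis=-1) / (2 * h**2))    # Gaussian kernel matrix
    # grad log rho(x_i) = sum_j K_ij * (-(x_i - x_j)/h^2) / sum_j K_ij
    grad_K = -diffs / h**2 * K[:, :, None]
    grad_log_rho = grad_K.sum(axis=1) / K.sum(axis=1, keepdims=True)
    return X + eta * (grad_logpi(X) - grad_log_rho)

# Toy target: standard Gaussian, so grad log pi(x) = -x.
rng = np.random.default_rng(0)
X = rng.normal(loc=3.0, scale=1.0, size=(200, 2))  # particles start off-target
for _ in range(500):
    X = wgd_step(X, lambda x: -x)
print(X.mean(axis=0))  # ensemble mean drifts toward the target mean at the origin
```

In high dimensions the KDE term above degrades rapidly, which is precisely the failure mode the paper's projection onto a low-dimensional subspace is designed to mitigate.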