We consider the problem of reducing the dimensions of parameters and data in non-Gaussian Bayesian inference problems. Our goal is to identify an "informed" subspace of the parameters and an "informative" subspace of the data so that a high-dimensional inference problem can be approximately reformulated in low-to-moderate dimensions, thereby improving the computational efficiency of many inference techniques. To do so, we exploit gradient evaluations of the log-likelihood function. Furthermore, we use an information-theoretic analysis to derive a bound on the posterior error due to parameter and data dimension reduction. This bound relies on logarithmic Sobolev inequalities, and it reveals the appropriate dimensions of the reduced variables. We compare our method with classical dimension reduction techniques, such as principal component analysis and canonical correlation analysis, on applications ranging from mechanics to image processing.
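To make the gradient-based construction concrete, below is a minimal sketch of one common way such an "informed" parameter subspace can be identified from gradients of the log-likelihood: form the prior-averaged outer product of gradients and take its leading eigenvectors, with the eigenvalue decay suggesting the reduced dimension. This is an illustrative assumption, not the paper's actual algorithm or notation; the standard-normal prior, the toy linear-Gaussian likelihood, and names such as `log_likelihood_grad` are all hypothetical.

```python
# Hedged sketch: gradient-based identification of an informed parameter
# subspace. The prior, likelihood, and all names are illustrative
# assumptions, not the paper's method.
import numpy as np

rng = np.random.default_rng(0)
d = 50            # parameter dimension (assumed)
n_samples = 1000  # Monte Carlo samples from the prior (assumed)

# Toy linear-Gaussian stand-in for a generic log-likelihood gradient:
# grad_x log L(x) = G^T (y - G x) with unit noise covariance.
G = rng.standard_normal((5, d)) / np.sqrt(d)
y = rng.standard_normal(5)

def log_likelihood_grad(x):
    return G.T @ (y - G @ x)

# Diagnostic matrix: prior average of the outer product of gradients,
# H = E_prior[ grad log L  (grad log L)^T ]  (standard-normal prior assumed).
X = rng.standard_normal((n_samples, d))
grads = np.array([log_likelihood_grad(x) for x in X])
H = grads.T @ grads / n_samples

# Leading eigenvectors of H span a candidate informed subspace; pick the
# smallest r capturing (say) 99% of the spectrum's mass.
eigvals, eigvecs = np.linalg.eigh(H)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
r = int(np.searchsorted(np.cumsum(eigvals) / eigvals.sum(), 0.99) + 1)
U_r = eigvecs[:, :r]  # columns form a basis of the informed subspace
print(f"reduced parameter dimension r = {r}")
```

An analogous construction with gradients taken with respect to the data would yield the "informative" data subspace; in both cases the spectrum of the diagnostic matrix plays the role that the error bound assigns to the choice of reduced dimensions.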