Variational Bayesian inference is an important machine-learning tool that finds application from statistics to robotics. The goal is to find an approximate probability density function (PDF) from a chosen family that is in some sense 'closest' to the full Bayesian posterior. Closeness is typically defined through the selection of an appropriate loss functional such as the Kullback-Leibler (KL) divergence. In this paper, we explore a new formulation of variational inference by exploiting the fact that (most) PDFs are members of a Bayesian Hilbert space under careful definitions of vector addition, scalar multiplication and an inner product. We show that variational inference based on KL divergence then amounts to an iterative projection, in the Euclidean sense, of the Bayesian posterior onto a subspace corresponding to the selected approximation family. We work through the details of this general framework for the specific case of the Gaussian approximation family and show the equivalence to another Gaussian variational inference approach. We furthermore discuss the implications for systems that exhibit sparsity, which is handled naturally in Bayesian space, and give an example of a high-dimensional robotic state estimation problem that can be handled as a result. Finally, we provide some preliminary examples of how the approach could be applied to non-Gaussian inference.
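As a rough illustration of the vector-space structure the abstract refers to, the sketch below implements the standard Bayes-space operations on a discretized grid: vector addition as a normalized pointwise product (which is exactly a Bayesian measurement update), scalar multiplication as a normalized power, and an inner product computed through a centred log-ratio representation. The grid, the example densities, and all helper names are illustrative assumptions under a uniform reference measure, not the paper's continuous-domain definitions.

```python
import numpy as np

def normalize(f):
    """Normalize a nonnegative grid function so it sums to one (a discretized PDF)."""
    return f / f.sum()

def bh_add(p, q):
    """Bayes-space 'vector addition': pointwise product, renormalized.
    This coincides with a Bayesian update of p by likelihood q."""
    return normalize(p * q)

def bh_scale(alpha, p):
    """Bayes-space 'scalar multiplication': pointwise power, renormalized."""
    return normalize(p ** alpha)

def clr(p):
    """Centred log-ratio coordinates: map a discretized PDF to an ordinary
    Euclidean vector (finite-dimensional analogue of the Bayes-space embedding)."""
    logp = np.log(p)
    return logp - logp.mean()

def bh_inner(p, q):
    """Bayes-space inner product evaluated in clr coordinates."""
    return np.dot(clr(p), clr(q))

# Example: two Gaussian-shaped densities on a common grid (illustrative only).
x = np.linspace(-5.0, 5.0, 501)
p = normalize(np.exp(-0.5 * (x - 1.0) ** 2))
q = normalize(np.exp(-0.5 * (x + 1.0) ** 2 / 4.0))

fused = bh_add(p, q)       # 'sum' of the two PDFs = normalized product (Bayes' rule)
tempered = bh_scale(0.5, p)  # 'scaling' tempers the PDF
print(bh_inner(p, q))        # inner product in clr coordinates
```

Because addition in this space is the Bayes update itself, a posterior decomposes as a 'sum' of its prior and measurement factors, which is what allows variational inference to be cast as a Euclidean-style projection onto the subspace spanned by the approximation family.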