We consider the analysis of probability distributions through their associated covariance operators from reproducing kernel Hilbert spaces. We show that the von Neumann entropy and relative entropy of these operators are intimately related to the usual notions of Shannon entropy and relative entropy, and share many of their properties. They come with efficient estimation algorithms from various oracles on the probability distributions. We also consider product spaces and show that for tensor product kernels, we can define notions of mutual information and joint entropies, which characterize independence perfectly, but conditional independence only partially. We finally show how these new notions of relative entropy lead to new upper bounds on log-partition functions, which can be used together with convex optimization within variational inference methods, providing a new family of probabilistic inference methods.
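As a minimal illustration of the central object, the sketch below estimates the von Neumann entropy of the empirical covariance operator from samples, using the fact that the (non-normalized) covariance operator shares its nonzero eigenvalues with the kernel matrix divided by the sample size. The Gaussian kernel, bandwidth, and function names here are illustrative choices, not the paper's specific estimators:

```python
import numpy as np

def rbf_kernel(X, Y, sigma=1.0):
    # Gaussian RBF kernel matrix: k(x, y) = exp(-||x - y||^2 / (2 sigma^2))
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq_dists / (2.0 * sigma ** 2))

def von_neumann_entropy(X, sigma=1.0):
    # The empirical covariance operator has the same nonzero spectrum as K / n,
    # so we take the kernel matrix eigenvalues, normalize them to unit trace
    # (so they behave like a probability vector), and compute -sum lam log lam.
    K = rbf_kernel(X, X, sigma)
    lam = np.clip(np.linalg.eigvalsh(K), 0.0, None)  # eigenvalues, floored at 0
    lam = lam / lam.sum()                            # unit-trace normalization
    lam = lam[lam > 1e-12]                           # drop numerical zeros
    return float(-(lam * np.log(lam)).sum())
```

A degenerate distribution (all samples identical) yields a rank-one, unit-trace operator with entropy 0, while spread-out samples give strictly positive entropy, mirroring the behavior of Shannon entropy that the abstract alludes to.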