In this work, we present formulations for regularized Kullback-Leibler and R\'enyi divergences via the Alpha Log-Determinant (Log-Det) divergences between positive Hilbert-Schmidt operators on Hilbert spaces, in two different settings, namely (i) covariance operators and Gaussian measures defined on reproducing kernel Hilbert spaces (RKHS); and (ii) Gaussian processes with square-integrable sample paths. For characteristic kernels, the first setting leads to divergences between arbitrary Borel probability measures on a complete, separable metric space. We show that the Alpha Log-Det divergences are continuous in the Hilbert-Schmidt norm, which enables us to apply laws of large numbers for Hilbert space-valued random variables. As a consequence, we show that, in both settings, the infinite-dimensional divergences can be consistently and efficiently estimated from their finite-dimensional versions, using finite-dimensional Gram matrices/Gaussian measures and finite sample data, with {\it dimension-independent} sample complexities in all cases. RKHS methodology plays a central role in the theoretical analysis in both settings. The mathematical formulation is illustrated by numerical experiments.
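For concreteness, we recall the finite-dimensional Alpha Log-Det divergence underlying these formulations, in the form introduced by Chebbi and Moakher for symmetric positive definite matrices; this is stated here as background, with the regularized operator versions deferred to the main text. For symmetric positive definite matrices $A, B$ and $\alpha \in (-1,1)$,
\[
D_{\alpha}(A,B) \;=\; \frac{4}{1-\alpha^{2}}\,
\log \frac{\det\!\bigl(\frac{1-\alpha}{2}A + \frac{1+\alpha}{2}B\bigr)}
{\det(A)^{\frac{1-\alpha}{2}}\,\det(B)^{\frac{1+\alpha}{2}}},
\]
whose limits as $\alpha \to \pm 1$ recover Kullback-Leibler-type divergences. The infinite-dimensional divergences studied in this work may be read as regularized extensions of this quantity to positive Hilbert-Schmidt operators.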