Kernel mean embeddings, a widely used technique in machine learning, map probability distributions to elements of a reproducing kernel Hilbert space (RKHS). In supervised learning, where input-output pairs are observed, the conditional distribution of the outputs given the inputs is a key object. The input-dependent conditional distribution of an output can be encoded by an RKHS-valued function, the conditional kernel mean map. In this paper we present a new recursive algorithm to estimate the conditional kernel mean map in a Hilbert space valued $L_2$ space, that is, in a Bochner space. We prove the weak and strong $L_2$ consistency of our recursive estimator under mild conditions. The key idea is to generalize Stone's theorem to Hilbert space valued regression on a locally compact Polish space. We present new insights about conditional kernel mean embeddings and give strong asymptotic bounds on the convergence of the proposed recursive method. Finally, the results are demonstrated on three application domains: inputs coming from Euclidean spaces, Riemannian manifolds, and locally compact subsets of function spaces.
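To fix intuition, a minimal sketch of a (non-recursive) regression-based estimator of a conditional kernel mean embedding is given below. It is an illustration only, not the recursive algorithm of the paper; the Gaussian kernel, the ridge regularization $\lambda$, and the helper names (`gaussian_gram`, `conditional_mean_weights`) are assumptions for the sketch. The estimated embedding takes the form $\widehat{\mu}_{Y\mid x} = \sum_i \alpha_i(x)\, k_Y(y_i,\cdot)$ with $\alpha(x) = (K + \lambda n I)^{-1} k_x$.

```python
import numpy as np

def gaussian_gram(a, b, gamma=1.0):
    # Gram matrix of the Gaussian RBF kernel between the rows of a and b.
    d2 = np.sum(a**2, axis=1)[:, None] + np.sum(b**2, axis=1)[None, :] - 2.0 * a @ b.T
    return np.exp(-gamma * d2)

def conditional_mean_weights(X, x_query, lam=1e-2):
    # Ridge-regularized weights alpha(x) of the empirical conditional
    # kernel mean embedding: mu_{Y|x} ~ sum_i alpha_i(x) k_Y(y_i, .).
    n = X.shape[0]
    K = gaussian_gram(X, X)               # input Gram matrix
    k_x = gaussian_gram(X, x_query)       # kernel evaluations at the query
    return np.linalg.solve(K + lam * n * np.eye(n), k_x)

# Toy supervised data: Y = sin(2X) + noise (illustrative assumption).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 1))
Y = np.sin(2.0 * X) + 0.1 * rng.normal(size=(200, 1))

x0 = np.array([[0.5]])
alpha = conditional_mean_weights(X, x0)
# Taking the identity feature in place of k_Y recovers a kernel ridge
# regression estimate of E[Y | X = x0].
estimate = float(alpha.T @ Y)
print(estimate)
```

The same weights $\alpha(x)$ serve any RKHS feature of $Y$; replacing `Y` by a feature matrix evaluates the embedding against other test functions.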