We establish a framework of random inverse problems with real-time observations over graphs, and present a decentralized online learning algorithm based on online data streams, which unifies distributed parameter estimation in Hilbert spaces and the least mean square problem in reproducing kernel Hilbert spaces (RKHS-LMS). We transform the convergence analysis of the algorithm into the asymptotic stability of a class of randomly time-varying difference equations in Hilbert space with L2-bounded martingale difference terms, and develop an L2-asymptotic stability theory. We show that if the network graph is connected and the sequence of forward operators satisfies the infinite-dimensional spatio-temporal persistence of excitation condition, then the estimates of all nodes are mean square and almost surely strongly consistent. By equivalently transforming the distributed learning problem in RKHS into a random inverse problem over graphs, we propose a decentralized online learning algorithm in RKHS based on non-stationary and non-independent online data streams, and prove that the algorithm is mean square and almost surely strongly consistent provided the operators induced by the random input data satisfy the infinite-dimensional spatio-temporal persistence of excitation condition.
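For intuition, decentralized online algorithms of the kind summarized above are often written at each node $i$, with neighbor set $\mathcal{N}_i$, in a consensus-plus-innovation form; the recursion below is an illustrative sketch under assumed notation ($H_{i,k}$, $y_{i,k}$, $a_k$, $b_k$, $w_{ij}$ are not taken from the paper's exact formulation):
\[
x_{i,k+1} = x_{i,k} + a_k \sum_{j \in \mathcal{N}_i} w_{ij}\,\big(x_{j,k} - x_{i,k}\big) + b_k\, H_{i,k}^{*}\big(y_{i,k} - H_{i,k}\, x_{i,k}\big),
\]
where $H_{i,k}$ denotes node $i$'s forward operator at time $k$, $y_{i,k}$ its real-time observation, $w_{ij}$ the edge weights of the connected graph, and $a_k$, $b_k$ step sizes. Stacking the estimation errors $e_{i,k} = x_{i,k} - x^{*}$ then yields a randomly time-varying difference equation of the schematic form
\[
e_{k+1} = \big(I - F_k\big)\, e_k + d_{k+1},
\]
in which $F_k$ is a random time-varying operator and $d_{k+1}$ is an L2-bounded martingale difference term, matching the class of equations whose L2-asymptotic stability is analyzed in the abstract.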