Solving distributed linear algebraic equations over networks, where each node holds a part of the problem data and the nodes cooperatively solve the equation via node-to-node communications, is a basic distributed computation task that has received increasing research attention. Communications over a network are stochastic in nature, with both temporal and spatial dependence caused by link failures, packet dropouts, node re-creation, etc. In this paper, we study the convergence and convergence rates of distributed linear equation protocols over a $\ast$-mixing random network, in which temporal and spatial dependencies between the node-to-node communications are allowed. When the network linear equation admits exact solutions, we prove a mean-squared exponential convergence rate for the distributed projection consensus algorithm, and give lower and upper bounds on this rate for independent and identically distributed (i.i.d.) random graphs. Motivated by the randomized Kaczmarz algorithm, we also propose a distributed randomized projection consensus algorithm, in which each node randomly selects one row of its local linear equations for projection at each iteration, and we establish an exponential convergence rate for it. When the network linear equation admits no exact solution, we prove that a distributed gradient-descent-like algorithm with diminishing step sizes drives all nodes' states to a least-squares solution at a sublinear rate. These results collectively illustrate that distributed computations can overcome communication correlations provided the prototype algorithms enjoy certain contractive properties or are designed with suitable parameters.
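For orientation, a minimal sketch of the prototype updates is given below, assuming the standard projection consensus form; the weights $w_{ij}(k)$, the local data $(A_i, b_i)$, and the full-row-rank assumption on $A_i$ are illustrative and need not match the paper's exact notation. Node $i$ holds the local equation $A_i x = b_i$, mixes its neighbors' states over the random graph at time $k$, and projects onto its local affine solution set:
$$
x_i(k+1) \;=\; \mathcal{P}_i\Big(\sum_{j=1}^{n} w_{ij}(k)\, x_j(k)\Big),
\qquad
\mathcal{P}_i(y) \;=\; y - A_i^{\top}\big(A_i A_i^{\top}\big)^{-1}\big(A_i y - b_i\big).
$$
In the randomized Kaczmarz-style variant, node $i$ replaces $\mathcal{P}_i$ with the projection onto a single randomly selected row $a_{i,\ell}^{\top} x = b_{i,\ell}$ of its local equation,
$$
\mathcal{P}_{i,\ell}(y) \;=\; y - \frac{a_{i,\ell}^{\top} y - b_{i,\ell}}{\|a_{i,\ell}\|^{2}}\, a_{i,\ell},
$$
which avoids forming $\big(A_i A_i^{\top}\big)^{-1}$ at the cost of projecting onto one hyperplane per iteration.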