In this work, we consider the asynchronous distributed optimization problem in which each node has its own convex cost function and can communicate directly only with its neighbors, as determined by a directed communication topology (directed graph or digraph). First, we reformulate the optimization problem so that the Alternating Direction Method of Multipliers (ADMM) can be utilized. Then, we propose an algorithm, herein called Asynchronous Approximate Distributed Alternating Direction Method of Multipliers (AsyAD-ADMM), which uses finite-time asynchronous approximate ratio consensus to solve the multi-node convex optimization problem; every node performs iterative computations and exchanges information with its neighbors asynchronously. More specifically, at every iteration of AsyAD-ADMM, each node solves a local convex optimization problem for one of the primal variables and utilizes a finite-time asynchronous approximate consensus protocol to obtain a value of the other primal variable that is close to its optimal value, since the cost function associated with this variable is not decomposable across nodes. If the individual cost functions are convex but not necessarily differentiable, the proposed algorithm converges at a rate of $\mathcal{O}(1/k)$, where $k$ is the iteration counter. The efficacy of AsyAD-ADMM is exemplified via a proof-of-concept distributed least-squares optimization problem, for which several performance-influencing factors are investigated.
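To make the iteration structure concrete, the following Python sketch simulates one possible instantiation of the AsyAD-ADMM round for the distributed least-squares example: a local x-update, a z-update carried out by approximate ratio consensus (since the averaging step is not decomposable), and a local dual update. This is a minimal sketch, not the authors' implementation: the synchronous matrix iteration stands in for the finite-time asynchronous protocol (asynchrony and the finite-time stopping rule are abstracted away), and all names, weights, and parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_nodes, dim, rho, n_admm = 4, 3, 1.0, 100

# Local data: node i holds (A_i, b_i) and the private cost (1/2)||A_i x - b_i||^2.
A = [rng.standard_normal((5, dim)) for _ in range(n_nodes)]
b = [rng.standard_normal(5) for _ in range(n_nodes)]

x = [np.zeros(dim) for _ in range(n_nodes)]   # local copies of the first primal variable
u = [np.zeros(dim) for _ in range(n_nodes)]   # scaled dual variables
z = np.zeros((n_nodes, dim))                  # per-node estimates of the second primal variable

# Column-stochastic weights on a directed ring: node i forwards half of its
# mass to node i+1 and keeps the rest (each node only needs its out-degree).
P = np.zeros((n_nodes, n_nodes))
for i in range(n_nodes):
    P[i, i] = 0.5
    P[(i + 1) % n_nodes, i] = 0.5

def approximate_ratio_consensus(values, n_steps=50):
    """Estimate the network-wide average at every node via ratio consensus:
    each node tracks a numerator y and a denominator w, and y/w -> average."""
    y = np.array(values, dtype=float)         # (n_nodes, dim) numerators
    w = np.ones((len(values), 1))             # denominators
    for _ in range(n_steps):                  # stand-in for the asynchronous protocol
        y, w = P @ y, P @ w
    return y / w                              # row i is node i's average estimate

for k in range(n_admm):
    # x-update: each node solves its local subproblem (closed form here).
    for i in range(n_nodes):
        H = A[i].T @ A[i] + rho * np.eye(dim)
        g = A[i].T @ b[i] + rho * (z[i] - u[i])
        x[i] = np.linalg.solve(H, g)
    # z-update: the non-decomposable averaging step, replaced by approximate
    # ratio consensus so every node obtains a value close to the true average.
    z = approximate_ratio_consensus([x[i] + u[i] for i in range(n_nodes)])
    # u-update: each node adjusts its dual variable using local quantities only.
    for i in range(n_nodes):
        u[i] = u[i] + x[i] - z[i]

# Compare against the centralized least-squares solution.
x_star, *_ = np.linalg.lstsq(np.vstack(A), np.concatenate(b), rcond=None)
print("max node error:", max(np.linalg.norm(x[i] - x_star) for i in range(n_nodes)))
```

The column-stochastic weights are what make the averaging step compatible with a digraph: each node needs to know only its out-degree, and the ratio $y/w$ cancels the imbalance that a directed topology would otherwise introduce into the limit.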