This paper presents NCOTA-DGD, a Decentralized Gradient Descent (DGD) algorithm that combines local gradient descent with a novel Non-Coherent Over-The-Air (NCOTA) consensus scheme to solve distributed machine-learning problems over wirelessly-connected systems. NCOTA-DGD leverages the waveform superposition properties of the wireless channel: it enables simultaneous transmissions under half-duplex constraints by mapping local optimization signals to a mixture of preamble sequences, and achieves consensus via non-coherent combining at the receivers. NCOTA-DGD operates without channel state information at transmitters and receivers, and leverages the average channel pathloss to mix signals, without explicit knowledge of the mixing weights (which are typically assumed known in consensus-based optimization algorithms). It is shown, both theoretically and numerically, that for smooth and strongly-convex problems with fixed consensus and learning stepsizes, the updates of NCOTA-DGD converge in Euclidean distance to the global optimum with rate $\mathcal{O}(K^{-1/4})$ for a target number of iterations $K$. NCOTA-DGD is evaluated numerically on a logistic regression problem, showing faster convergence vis-\`a-vis running time than implementations of the classical DGD algorithm over digital and analog orthogonal channels.
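For context, the classical consensus-plus-gradient iteration that NCOTA-DGD builds upon can be sketched in generic notation (the iterates $x_i^k$, mixing weights $w_{ij}$, consensus stepsize $\gamma$, and learning stepsize $\eta$ are illustrative symbols, not necessarily the paper's): each node $i$ combines a consensus step over its neighbors' iterates with a local gradient step,
\[
x_i^{k+1} \;=\; x_i^{k} \;+\; \gamma \sum_{j \neq i} w_{ij}\left(x_j^{k} - x_i^{k}\right) \;-\; \eta\, \nabla f_i\!\left(x_i^{k}\right).
\]
In NCOTA-DGD, per the abstract, the consensus term is not computed from explicitly exchanged iterates: it is estimated from the non-coherently combined superposition of preamble-mapped transmissions, so the weights $w_{ij}$ arise implicitly from the average channel pathlosses rather than being known in advance.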