This paper presents NCOTA-DGD, a Decentralized Gradient Descent (DGD) algorithm that combines local gradient descent with Non-Coherent Over-The-Air (NCOTA) consensus to solve distributed machine-learning problems over wirelessly-connected systems. NCOTA-DGD leverages the waveform superposition properties of wireless channels: it enables simultaneous transmissions under half-duplex constraints by mapping local signals to a mixture of preamble sequences, and achieves consensus via non-coherent combining at the receivers. NCOTA-DGD operates without channel state information and exploits the average channel pathloss to mix signals, without explicit knowledge of the mixing weights (typically assumed known in consensus-based optimization algorithms). It is shown both theoretically and numerically that, for smooth and strongly-convex problems with fixed consensus and learning stepsizes, the updates of NCOTA-DGD converge (in Euclidean distance) to the global optimum with rate $\mathcal O(K^{-1/4})$ for a target number of iterations $K$. NCOTA-DGD is evaluated numerically on a logistic regression problem, showing faster convergence as a function of running time than implementations of the classical DGD algorithm over digital and analog orthogonal channels.
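To make the update rule concrete, below is a minimal NumPy sketch of a DGD-style iteration in which noisy, pathloss-weighted mixing stands in for NCOTA consensus. This is an illustration under stated assumptions, not the paper's signal model: the mixing matrix `P`, the consensus and learning stepsizes `gamma` and `eta`, the noise level `sigma`, and the toy quadratic objective are all placeholders.

```python
import numpy as np

# Minimal sketch of a DGD-style update with noisy, pathloss-weighted mixing
# standing in for NCOTA consensus. All names below (P, gamma, eta, sigma)
# are illustrative placeholders, not the paper's exact signal model.

def ncota_dgd_sketch(grad_fns, x0, P, gamma, eta, sigma, K, rng):
    """Run K iterations of
        x_i <- x_i + gamma * ((P x)_i + noise - x_i) - eta * grad_i(x_i),
    i.e., local gradient descent plus a consensus step driven by a
    superposed, noise-corrupted mixture of the other nodes' iterates."""
    n = len(grad_fns)
    x = np.tile(np.asarray(x0, dtype=float), (n, 1))  # one iterate per node
    for _ in range(K):
        # Receivers observe a pathloss-weighted superposition of all
        # transmitted signals, corrupted by additive receiver noise.
        rx = P @ x + sigma * rng.standard_normal(x.shape)
        grads = np.stack([g(xi) for g, xi in zip(grad_fns, x)])
        x = x + gamma * (rx - x) - eta * grads
    return x

# Toy usage: n = 4 nodes, f_i(x) = 0.5 * ||x - b_i||^2, optimum at mean(b).
rng = np.random.default_rng(0)
b = rng.standard_normal((4, 3))
grad_fns = [lambda x, bi=bi: x - bi for bi in b]
P = np.full((4, 4), 0.25)  # uniform average-pathloss mixing weights
x_final = ncota_dgd_sketch(grad_fns, np.zeros(3), P, 0.5, 0.1, 0.01, 500, rng)
print(np.linalg.norm(x_final - b.mean(axis=0), axis=1))  # distance per node
```

With fixed `gamma` and `eta`, the iterates settle into a noise- and bias-limited neighborhood of the optimum whose radius shrinks with the stepsizes; this matches the abstract's setting, where the fixed stepsizes are chosen as a function of the target horizon $K$ to obtain the $\mathcal O(K^{-1/4})$ rate.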