Decentralized Gradient Descent (DGD) is a popular algorithm for solving decentralized optimization problems in diverse domains such as remote sensing, distributed inference, multi-agent coordination, and federated learning. Yet, executing DGD over wireless systems affected by noise, fading, and limited bandwidth is challenging: it requires scheduling transmissions to mitigate interference and acquiring topology and channel state information, complex tasks in wireless decentralized systems. This paper proposes a DGD algorithm tailored to wireless systems. Unlike existing approaches, it operates without inter-agent coordination, topology information, or channel state information. Its core is a Non-Coherent Over-The-Air (NCOTA) consensus scheme that exploits a noisy energy-superposition property of wireless channels. Using a randomized transmission strategy to accommodate half-duplex operation, transmitters map local optimization signals to energy levels across the subcarriers of an OFDM frame and transmit concurrently without coordination. It is shown that the received energies form a noisy consensus signal, whose fluctuations are mitigated via a consensus stepsize. NCOTA-DGD leverages the channel pathloss for consensus formation, without explicit knowledge of the mixing weights. For strongly convex problems and suitably designed decreasing stepsizes, the expected squared distance between the local models and the global optimum is shown to vanish at rate $\mathcal{O}(1/\sqrt{k})$ after $k$ iterations. Extensions address a broad class of fading models and frequency-selective channels. Numerical results on an image classification task demonstrate faster convergence, in terms of running time, than state-of-the-art schemes, especially in densely deployed networks.
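To make the mechanism concrete, the sketch below simulates a toy NCOTA-style consensus-plus-gradient loop in NumPy. It is only an illustration of the high-level idea, not the paper's actual signal design: the scalar quadratic local losses, the pathloss matrix, the two-subcarrier energy mapping, the noise level, and the stepsize constants are all assumptions introduced here for the example.

```python
# Toy sketch of an NCOTA-DGD-style iteration: randomized half-duplex split,
# energy-level encoding of local models, non-coherent (random-phase) concurrent
# transmissions whose energies superpose at each receiver, a consensus step with
# a decreasing consensus stepsize, and a local gradient step.
import numpy as np

rng = np.random.default_rng(0)
N, K = 20, 3000                                    # agents, iterations
b = rng.uniform(-1.0, 1.0, N)                      # local losses f_i(x) = 0.5*(x - b_i)^2
P = rng.uniform(0.1, 1.0, (N, N))                  # assumed symmetric pathloss matrix
P = (P + P.T) / 2; np.fill_diagonal(P, 0.0)
x = rng.uniform(-1.0, 1.0, N)                      # local models
lo, hi, sigma = -2.0, 2.0, 0.05                    # model range for energy mapping, noise std

for k in range(1, K + 1):
    gamma, eta = 0.5 / np.sqrt(k), 1.0 / k         # decreasing consensus / gradient stepsizes
    tx = rng.random(N) < 0.5                       # randomized half-duplex split (True = transmit)
    xn = np.clip((x - lo) / (hi - lo), 0.0, 1.0)   # map models to energy levels in [0, 1]
    ph = np.exp(1j * 2 * np.pi * rng.random(N))    # unknown channel/transmit phases
    for i in np.flatnonzero(~tx):                  # receivers measure superposed energies
        r1 = sum(np.sqrt(P[i, j] * xn[j]) * ph[j] for j in np.flatnonzero(tx))
        r2 = sum(np.sqrt(P[i, j] * (1 - xn[j])) * ph[j] for j in np.flatnonzero(tx))
        e1 = abs(r1 + sigma * (rng.normal() + 1j * rng.normal())) ** 2
        e2 = abs(r2 + sigma * (rng.normal() + 1j * rng.normal())) ** 2
        # e1 / (e1 + e2) is a noisy estimate of the pathloss-weighted neighborhood average,
        # obtained without topology or channel state information.
        x[i] += gamma * ((e1 / (e1 + e2 + 1e-12)) * (hi - lo) + lo - x[i])
    x -= eta * (x - b)                             # local gradient step on f_i
print("max deviation from global optimum:", np.abs(x - b.mean()).max())
```

In this toy setting the decreasing stepsizes play the roles described in the abstract: the consensus stepsize averages out the energy fluctuations caused by random phases and receiver noise, while the gradient stepsize drives each local model toward the minimizer of its own loss.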