We consider over-the-air convex optimization over a $d$-dimensional space, where coded gradients are sent over an additive Gaussian noise channel with variance $\sigma^2$. The codewords satisfy an average power constraint $P$, resulting in a signal-to-noise ratio (SNR) of $P/\sigma^2$. We derive bounds on the convergence rate of over-the-air optimization. Our first result is a lower bound showing that any code must slow down the convergence rate by a factor of roughly $\sqrt{d/\log(1+\mathtt{SNR})}$. Next, we consider a popular class of schemes called \emph{analog coding}, where a linear function of the gradient is sent. We show that a simple scaled-transmission analog coding scheme slows down the convergence rate by a factor of $\sqrt{d(1+1/\mathtt{SNR})}$. This matches the lower bound up to constant factors at low SNR, making scaled transmission optimal in that regime. However, we show that this slowdown is unavoidable for any analog coding scheme: in particular, a $\sqrt{d}$ slowdown in convergence persists even as the SNR tends to infinity. Remarkably, we present a simple quantize-and-modulate scheme based on \emph{amplitude shift keying} that almost attains the optimal convergence rate at all SNRs.
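To make the scaled-transmission scheme concrete, the following is a minimal simulation sketch (not taken from the paper) of noisy gradient descent in which each gradient is scaled to meet an assumed per-codeword power budget $P$, sent over an AWGN channel with noise variance $\sigma^2$, and rescaled at the receiver. The objective, step size, and gradient-norm bound `B` are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch of scaled-transmission analog coding for over-the-air SGD.
# Assumptions (not from the abstract): per-codeword power budget P, gradients
# clipped to norm B, a simple quadratic objective, and a fixed step size.

rng = np.random.default_rng(0)
d, T = 50, 2000          # dimension, number of iterations
P, sigma2 = 1.0, 0.5     # power budget and channel noise variance (SNR = P / sigma2)
B = 10.0                 # assumed bound on the gradient norm
eta = 0.05               # step size

x_star = rng.normal(size=d)          # minimizer of f(x) = 0.5 * ||x - x_star||^2
x = np.zeros(d)

for t in range(T):
    g = x - x_star                                       # exact gradient of the quadratic
    g = g * min(1.0, B / (np.linalg.norm(g) + 1e-12))    # clip so that ||g|| <= B

    # Transmitter: linear (analog) encoding -- scale so that ||codeword||^2 <= P.
    codeword = np.sqrt(P) / B * g

    # AWGN channel: one channel use per coordinate.
    received = codeword + rng.normal(scale=np.sqrt(sigma2), size=d)

    # Receiver: undo the scaling to obtain a noisy gradient estimate.
    g_hat = B / np.sqrt(P) * received

    x -= eta * g_hat

print("final error:", np.linalg.norm(x - x_star))
```

In this sketch the receiver's estimate is unbiased, but its error variance is $d\,\sigma^2 B^2/P$, which illustrates how the channel-noise contribution of a purely linear scheme grows with the dimension.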
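For contrast, here is a hedged sketch of a generic quantize-and-modulate transmission using amplitude shift keying; the abstract does not specify the paper's exact quantizer or constellation, so the number of levels `K`, the clipping threshold `B`, and the uniform quantizer below are assumptions made only for illustration.

```python
import numpy as np

# Illustrative sketch (not the paper's exact scheme) of quantize-and-modulate:
# each clipped gradient coordinate is uniformly quantized to K levels, the level
# index is mapped to an ASK amplitude meeting the power budget P, and the
# receiver performs nearest-amplitude detection followed by dequantization.

rng = np.random.default_rng(1)
K = 4                      # assumed number of quantization / ASK levels
P, sigma2 = 1.0, 0.1       # power budget and channel noise variance
B = 1.0                    # assumed per-coordinate clipping threshold

# Equally spaced ASK amplitudes, scaled so the average symbol power is P.
levels = np.arange(K) - (K - 1) / 2.0
amplitudes = levels * np.sqrt(P / np.mean(levels ** 2))

def transmit(g):
    """Quantize each coordinate of g to K levels and modulate onto ASK amplitudes."""
    g_clipped = np.clip(g, -B, B)
    idx = np.round((g_clipped + B) / (2 * B) * (K - 1)).astype(int)  # level index in {0,...,K-1}
    return amplitudes[idx]

def receive(y):
    """Nearest-amplitude detection followed by dequantization back to [-B, B]."""
    idx = np.argmin(np.abs(y[:, None] - amplitudes[None, :]), axis=1)
    return idx / (K - 1) * 2 * B - B

g = rng.normal(scale=0.3, size=8)                                   # a toy gradient vector
y = transmit(g) + rng.normal(scale=np.sqrt(sigma2), size=g.size)    # AWGN channel
g_hat = receive(y)
print(np.round(g, 2), np.round(g_hat, 2))
```

Unlike a linear analog map, the number of levels $K$ is a free parameter that can be chosen as a function of the SNR, which is roughly how digital schemes of this kind can avoid the dimension-dependent penalty that persists for analog coding at high SNR.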