Emerging edge intelligence applications require the server to continuously retrain and update deep neural networks deployed on remote edge nodes in order to leverage newly collected data samples. Unfortunately, it may be impossible in practice to continuously send fully updated weights to these edge nodes due to highly constrained communication resources. In this paper, we propose the weight-wise deep partial updating paradigm, which smartly selects only a subset of weights to update at each server-to-edge communication round, while achieving performance comparable to full updating. Our method is established by analytically upper-bounding the loss difference between partial updating and full updating, and it updates only those weights that contribute most to this upper bound. Extensive experimental results demonstrate the efficacy of our partial updating methodology, which achieves high inference accuracy while updating only a small fraction of the weights.
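To make the selection step concrete, below is a minimal NumPy sketch of one server-to-edge round under stated assumptions: the contribution score |δw| · |∇L| is an illustrative first-order proxy for the analytical upper bound described above, not the paper's exact metric, and all names (partial_update, update_ratio) are hypothetical.

```python
import numpy as np

def partial_update(w_old, w_new, grads, update_ratio=0.05):
    """One server-to-edge partial-updating round (illustrative sketch).

    w_old:        flat array of weights currently deployed on the edge node
    w_new:        flat array of weights after full retraining on the server
    grads:        flat array of loss gradients at w_old (used in the score)
    update_ratio: fraction of weights transmitted per communication round
    """
    delta = w_new - w_old
    # Hypothetical contribution score: a first-order proxy combining the
    # per-weight change and the local gradient magnitude; the paper derives
    # its own score from the analytical upper bound on the loss difference.
    score = np.abs(delta) * np.abs(grads)
    k = max(1, int(update_ratio * w_old.size))
    idx = np.argpartition(score, -k)[-k:]  # indices of top-k contributing weights
    # Only (idx, delta[idx]) needs to be sent to the edge node; the edge
    # reconstructs the partially updated network by applying the sparse delta.
    w_partial = w_old.copy()
    w_partial[idx] = w_new[idx]
    return w_partial, idx, delta[idx]
```

With update_ratio=0.05, only 5% of the weights (plus their indices) cross the server-to-edge link each round, which is the source of the communication savings.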