Emerging edge intelligence applications require the server to retrain and update deep neural networks deployed on remote edge nodes to leverage newly collected data samples. Unfortunately, it may be impossible in practice to continuously send fully updated weights to these edge nodes due to highly constrained communication resources. In this paper, we propose the weight-wise deep partial updating paradigm, which smartly selects a small subset of weights to update in each server-to-edge communication round, while achieving performance similar to full updating. Our method is established by analytically upper-bounding the loss difference between partial updating and full updating, and it updates only the weights that make the largest contributions to this upper bound. Extensive experimental results demonstrate the efficacy of our partial updating methodology, which achieves high inference accuracy while updating only a small fraction of the weights.
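The communication pattern described above can be sketched as follows. This is a minimal illustration, not the paper's exact method: the true selection criterion is derived from the analytic upper bound on the loss difference, whereas here the per-weight contribution is approximated by the magnitude of the weight change, and the function names (`partial_update`, `apply_update`) are hypothetical.

```python
def partial_update(old_weights, new_weights, k):
    """Server side: return a sparse update {index: new_value} covering only
    the k weights whose change is largest (a stand-in for the paper's
    upper-bound contribution score)."""
    scored = [(abs(n - o), i) for i, (o, n) in enumerate(zip(old_weights, new_weights))]
    scored.sort(reverse=True)
    return {i: new_weights[i] for _, i in scored[:k]}

def apply_update(weights, sparse_update):
    """Edge side: apply the received sparse update in place."""
    for i, v in sparse_update.items():
        weights[i] = v
    return weights

# Toy example: only 1 of 4 weights is transmitted per round.
old = [0.5, -1.2, 0.3, 2.0]          # weights currently deployed on the edge
new = [0.6, -1.1, 1.5, 2.0]          # fully retrained weights on the server
upd = partial_update(old, new, k=1)  # the single most-changed weight is sent
apply_update(old, upd)
```

In a real deployment the per-round budget `k` trades communication cost against how closely the partially updated model tracks the fully retrained one.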