With the proliferation of mobile devices, distributed learning approaches that enable model training on decentralized data have attracted wide research interest. However, the limited training capability of edge devices significantly constrains the energy efficiency of distributed learning in practice. This paper describes a novel approach to training DNNs that exploits the redundancy and the weight-asymmetry potential of conventional backpropagation. We demonstrate that, with negligible loss in classification accuracy, the proposed approach outperforms prior art by 5x in energy efficiency.