Federated learning is a machine learning approach in which data is not aggregated on a server; instead, models are trained locally at clients, in consideration of security and privacy. ResNet is a classic yet representative neural network that succeeds in deepening the network by learning residual functions, using shortcut connections that add a block's input to its output. In federated learning, the server and clients communicate to exchange weight parameters. Because ResNet has deep layers and a large number of parameters, the communication size becomes large. In this paper, we use Neural ODE as a lightweight counterpart of ResNet to reduce communication size in federated learning. In addition, we newly introduce a flexible federated learning scheme that uses Neural ODE models with different numbers of iterations, which correspond to ResNet models of different depths. Evaluation results on the CIFAR-10 dataset show that using Neural ODE reduces communication size by up to 92.4% compared to ResNet. We also show that the proposed flexible federated learning can merge models with different iteration counts, i.e., different depths.
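The correspondence between ResNet depth and Neural ODE iterations rests on viewing a residual block x + f(x) as one forward-Euler step of the ODE dx/dt = f(x), so a single weight-tied block can be iterated in place of a stack of distinct blocks. The sketch below illustrates this idea in PyTorch; it is a minimal, illustrative construction, not the paper's implementation, and the names (ODEBlock, n_steps) are assumptions introduced here.

```python
# Minimal sketch: a Neural-ODE-style block that reuses one set of
# convolution weights across n_steps Euler steps, in contrast to a
# plain ResNet, which needs fresh weights for every block it stacks.
import torch
import torch.nn as nn

class ODEBlock(nn.Module):
    """One weight-tied residual function f, iterated n_steps times:
    x_{k+1} = x_k + (1/n_steps) * f(x_k)  (forward-Euler discretization)."""
    def __init__(self, channels: int, n_steps: int):
        super().__init__()
        self.n_steps = n_steps
        self.f = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.ReLU(),
            nn.Conv2d(channels, channels, 3, padding=1),
        )

    def forward(self, x):
        h = 1.0 / self.n_steps  # step size of the Euler solver
        for _ in range(self.n_steps):
            x = x + h * self.f(x)  # the same parameters are reused each step
        return x

# The parameter count is independent of n_steps, so clients running
# different iteration counts (i.e., different effective depths) still
# exchange weight tensors of identical shape with the server.
shallow = ODEBlock(channels=64, n_steps=2)
deep = ODEBlock(channels=64, n_steps=10)
assert sum(p.numel() for p in shallow.parameters()) == \
       sum(p.numel() for p in deep.parameters())
```

Because the exchanged tensors have identical shapes regardless of the iteration count, a server can average them directly, which is what makes merging models of different effective depths possible in the proposed flexible federated learning.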