Federated learning is a distributed machine learning method in which a single server and multiple clients collaboratively build machine learning models without sharing the clients' datasets. Numerous methods have been proposed to cope with the data heterogeneity issue in federated learning. Existing solutions require a model architecture tuned by the central server, yet a major technical challenge is that this architecture is difficult to tune because no local data reside on the central server. In this paper, we propose Federated learning via Model exchange (FedMe), which personalizes models with automatic model architecture tuning during the learning process. The novelty of FedMe lies in its learning process: clients exchange their models for model architecture tuning and model training. First, to optimize the model architectures for local data, clients tune their own personalized models by comparing them to the exchanged models and selecting the one that yields the best performance. Second, clients train both their personalized models and the exchanged models via deep mutual learning, despite the different model architectures across clients. We perform experiments on three real datasets and show that FedMe outperforms state-of-the-art federated learning methods while tuning model architectures automatically.
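The deep mutual learning step mentioned above trains two models of possibly different architectures together: each minimizes its own supervised loss plus a KL-divergence term pulling its predictions toward its peer's. The following is a minimal NumPy sketch of that objective only (not FedMe's full training loop); the function and variable names are illustrative assumptions, not from the paper.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the last axis.
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def cross_entropy(probs, labels):
    # Mean negative log-likelihood of the true labels.
    return -np.mean(np.log(probs[np.arange(len(labels)), labels] + 1e-12))

def kl_divergence(p, q):
    # Mean KL(p || q) over the batch.
    return np.mean(np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)), axis=-1))

def mutual_learning_losses(logits_a, logits_b, labels):
    # Each model's loss = its own cross-entropy + KL toward the peer's
    # predictions. The logits may come from different architectures;
    # only the output distributions need to be comparable.
    p_a, p_b = softmax(logits_a), softmax(logits_b)
    loss_a = cross_entropy(p_a, labels) + kl_divergence(p_b, p_a)
    loss_b = cross_entropy(p_b, labels) + kl_divergence(p_a, p_b)
    return loss_a, loss_b
```

Because the coupling is only through output distributions, this is what lets FedMe co-train a client's personalized model with an exchanged model of a different architecture.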