With easy-to-use deep learning libraries such as TensorFlow and PyTorch now widely adopted, developing machine learning models has become convenient. Owing to the privacy issues of centralized machine learning, federated learning, a distributed computing framework, has recently attracted attention. In federated learning, the central server does not collect sensitive personal data from clients; it only aggregates their model parameters. Although federated learning helps protect privacy, it makes it difficult for machine learning developers to share models that could be reused in applications from different domains. In this paper, we propose a federated learning model sharing service named Federated Learning Hub (FLHub). Similarly to GitHub, users can upload, download, and contribute to models developed by other developers. We demonstrate that a forked model finishes training faster than the existing model and that learning progresses more quickly in each federated round.
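The aggregation step mentioned above can be illustrated with a minimal sketch. This is not the paper's implementation; it is a generic federated-averaging (FedAvg-style) example in which a hypothetical server averages client parameter vectors, weighted by each client's local dataset size.

```python
# Minimal sketch (an assumption, not FLHub's actual code) of the server-side
# aggregation step in federated learning: parameters are averaged across
# clients, weighted by the number of local training samples each client holds.

def fedavg(client_params, client_sizes):
    """Return the weighted average of per-client parameter vectors.

    client_params: list of parameter vectors (lists of floats), one per client.
    client_sizes: list of local training-sample counts, one per client.
    """
    total = sum(client_sizes)
    aggregated = [0.0] * len(client_params[0])
    for params, size in zip(client_params, client_sizes):
        weight = size / total
        for i, p in enumerate(params):
            aggregated[i] += weight * p
    return aggregated

# Example: two clients, the second holding three times as much data.
print(fedavg([[1.0, 2.0], [5.0, 6.0]], [1, 3]))  # -> [4.0, 5.0]
```

Because only these aggregated parameters leave the clients, the raw training data never reaches the server.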