Federated learning (FL) is a computational paradigm that enables organizations to collaborate on machine learning (ML) projects without sharing sensitive data, such as patient records, financial data, or classified secrets. Open Federated Learning (OpenFL, https://github.com/intel/openfl) is an open-source framework for training ML algorithms using the data-private collaborative learning paradigm of FL. OpenFL works with training pipelines built with both TensorFlow and PyTorch, and can be easily extended to other ML and deep learning frameworks. Here, we summarize the motivation and development characteristics of OpenFL, with the intention of facilitating its application to existing ML model training in a production environment. Finally, we describe the first use of the OpenFL framework to train consensus ML models in a consortium of international healthcare organizations, as well as how it facilitates the first computational competition on FL.