Distributed training across several quantum computers could significantly reduce training time, and sharing the learned model rather than the data could improve data privacy, since training would happen where the data is located. However, to the best of our knowledge, no prior work has studied quantum machine learning (QML) in a federated setting. In this work, we present federated training of hybrid quantum-classical machine learning models, although our framework could be generalized to pure quantum machine learning models. Specifically, we consider a quantum neural network (QNN) coupled with a pre-trained classical convolutional model. Our distributed federated learning scheme achieves almost the same trained-model accuracy while training significantly faster. This demonstrates a promising direction for future research on both scalability and privacy.
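To make the setup concrete, below is a minimal sketch of one federated round over such a hybrid model, assuming PennyLane with a PyTorch interface and a FedAvg-style weight average as the aggregation rule. The number of qubits, the QNN ansatz (`AngleEmbedding` + `BasicEntanglerLayers`), the stand-in frozen backbone, and the three-client setup are illustrative assumptions, not the paper's exact configuration.

```python
# Illustrative sketch (assumptions noted above): a frozen classical
# feature extractor feeding a trainable QNN head, trained federatedly
# by averaging client weights -- the data itself never leaves a client.
import copy
import torch
import torch.nn as nn
import pennylane as qml

n_qubits = 4
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def qnode(inputs, weights):
    # Encode classical features as qubit rotations, then apply a
    # trainable entangling variational circuit.
    qml.AngleEmbedding(inputs, wires=range(n_qubits))
    qml.BasicEntanglerLayers(weights, wires=range(n_qubits))
    return [qml.expval(qml.PauliZ(w)) for w in range(n_qubits)]

class HybridModel(nn.Module):
    """Frozen classical feature extractor followed by a trainable QNN head."""
    def __init__(self, feature_dim=16):
        super().__init__()
        # Stand-in for a pre-trained convolutional backbone (kept frozen).
        self.backbone = nn.Linear(feature_dim, n_qubits)
        for p in self.backbone.parameters():
            p.requires_grad = False
        weight_shapes = {"weights": (2, n_qubits)}  # 2 variational layers
        self.qnn = qml.qnn.TorchLayer(qnode, weight_shapes)
        self.head = nn.Linear(n_qubits, 2)

    def forward(self, x):
        return self.head(self.qnn(self.backbone(x)))

def fed_avg(global_model, client_models):
    """One federated round: average client weights into the global model."""
    global_state = global_model.state_dict()
    for key in global_state:
        global_state[key] = torch.stack(
            [m.state_dict()[key].float() for m in client_models]).mean(dim=0)
    global_model.load_state_dict(global_state)
    return global_model

# Each client trains a copy of the global model on its local data
# (local training loop omitted); only the weights are sent back.
global_model = HybridModel()
clients = [copy.deepcopy(global_model) for _ in range(3)]
global_model = fed_avg(global_model, clients)
```

In this scheme, only the QNN and head weights actually change across clients; averaging the frozen backbone is a no-op, which mirrors the design where the classical pre-trained model is shared and only the quantum part is fine-tuned locally.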