Machine learning using quantum convolutional neural networks (QCNNs) has demonstrated success in both quantum and classical data classification. In previous studies, QCNNs attained higher classification accuracy than their classical counterparts under the same training conditions in the few-parameter regime. However, the general performance of large-scale quantum models is difficult to examine because of the limited size of the quantum circuits that can be reliably implemented in the near term. We propose transfer learning as an effective strategy for fully utilizing small QCNNs in the noisy intermediate-scale quantum (NISQ) era. In the classical-to-quantum transfer learning framework, a QCNN can solve complex classification problems without requiring a large-scale quantum circuit by building on a pre-trained classical convolutional neural network (CNN). We perform numerical simulations of QCNN models with various sets of quantum convolution and pooling operations for MNIST data classification under transfer learning, in which a classical CNN is trained on Fashion-MNIST data. The results show that classical-to-quantum transfer learning performs considerably better than purely classical transfer learning models under similar training conditions.
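As a rough illustration of the classical-to-quantum pipeline described above, the sketch below simulates a tiny QCNN-style circuit in plain NumPy: classical features (standing in for the frozen CNN's output) are angle-encoded as single-qubit rotations, a "convolution" layer of parameterized two-qubit blocks is applied to neighbouring qubits, and a single-qubit expectation value serves as the class score. The gate choices, encoding, and readout here are illustrative assumptions, not the specific circuits studied in the paper.

```python
import numpy as np

def ry(theta):
    """Single-qubit RY rotation matrix (real-valued)."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

# Standard CNOT gate on two adjacent qubits (control on the first)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

def apply(state, gate, start, n):
    """Apply `gate` on consecutive qubits [start, start+k) of an n-qubit state."""
    k = int(np.log2(gate.shape[0]))
    op = np.kron(np.kron(np.eye(2**start), gate), np.eye(2**(n - start - k)))
    return op @ state

def conv_block(state, q, theta, n):
    """Hypothetical two-qubit 'quantum convolution': local RYs then a CNOT."""
    state = apply(state, np.kron(ry(theta[0]), ry(theta[1])), q, n)
    return apply(state, CNOT, q, n)

def qcnn_score(features, params, n=4):
    """Angle-encode classical features, apply a conv layer, read out <Z> on qubit 0."""
    state = np.zeros(2**n)
    state[0] = 1.0  # start from |0...0>
    for q, f in enumerate(features):          # feature encoding
        state = apply(state, ry(f), q, n)
    for i, q in enumerate([0, 2, 1]):         # conv blocks on neighbouring pairs
        state = conv_block(state, q, params[2 * i:2 * i + 2], n)
    z0 = np.kron(np.diag([1.0, -1.0]), np.eye(2**(n - 1)))
    return float(state @ z0 @ state)          # expectation in [-1, 1]

rng = np.random.default_rng(0)
score = qcnn_score(rng.uniform(0, np.pi, 4), rng.uniform(0, np.pi, 6))
```

In the full transfer-learning setup, `features` would come from the frozen pre-trained CNN and `params` would be trained on the target (MNIST) task while the classical weights stay fixed; here both are random placeholders.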