Recent developments in quantum computing and machine learning have propelled the interdisciplinary study of quantum machine learning. Sequential modeling is an important task with high scientific and commercial value. Existing VQC- or QNN-based methods require significant computational resources to perform gradient-based optimization of a large number of quantum circuit parameters. The major drawback is that such quantum gradient calculation requires a large number of circuit evaluations, posing a challenge for current near-term quantum hardware and simulation software. In this work, we approach sequential modeling by applying a reservoir computing (RC) framework to quantum recurrent neural networks (QRNN-RC) that are based on the classical RNN, LSTM and GRU. The main idea of this RC approach is that the QRNN with randomly initialized weights is treated as a dynamical system and only the final classical linear layer is trained. Our numerical simulations show that the QRNN-RC can reach results comparable to fully trained QRNN models on several function approximation and time series prediction tasks. Since the QRNN training complexity is significantly reduced, the proposed model trains notably faster. In this work, we also compare against the corresponding classical RNN-based RC implementations and show that the quantum version learns faster, requiring fewer training epochs in most cases. Our results demonstrate a new possibility to utilize quantum neural networks for sequential modeling with greater quantum hardware efficiency, an important design consideration for noisy intermediate-scale quantum (NISQ) computers.
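
To make the reservoir-computing idea concrete, the sketch below shows a minimal classical analogue of the training scheme described above: a recurrent network with frozen, randomly initialized weights acts as the reservoir, and only a final linear readout is fit. The quantum circuit is not modeled here; a plain tanh RNN stands in for the QRNN, and all sizes, names, and the toy sine-prediction task are illustrative assumptions, not the paper's implementation.

```python
# Minimal classical sketch of the reservoir-computing (RC) scheme: the recurrent
# weights are random and frozen (stand-in for the QRNN reservoir), and only the
# final linear readout layer is trained. Sizes and the toy task are assumed.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res = 1, 64                                  # input and reservoir dimensions (assumed)

W_in = rng.normal(scale=0.5, size=(n_res, n_in))     # fixed random input weights
W_res = rng.normal(scale=1.0, size=(n_res, n_res))   # fixed random recurrent weights
W_res *= 0.9 / max(abs(np.linalg.eigvals(W_res)))    # rescale spectral radius below 1

def reservoir_states(u):
    """Run the frozen recurrent dynamics over an input sequence u of shape (T, n_in)."""
    h = np.zeros(n_res)
    states = []
    for u_t in u:
        h = np.tanh(W_in @ u_t + W_res @ h)
        states.append(h.copy())
    return np.array(states)                          # shape (T, n_res)

# Toy task: one-step-ahead prediction of a sine wave (function approximation).
t = np.linspace(0, 8 * np.pi, 400)
u = np.sin(t)[:, None]
X = reservoir_states(u[:-1])
y = u[1:]                                            # targets: next value in the series

# Train ONLY the linear readout, here via ridge regression in closed form.
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)

pred = X @ W_out
print("train MSE:", float(np.mean((pred - y) ** 2)))
```

Because the reservoir weights are never updated, no gradients need to be propagated through the recurrent (or, in the quantum case, circuit) dynamics; this is the source of the reduced training cost highlighted in the abstract.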