With the steady increase in the number of quantum bits (qubits) in real quantum computers, implementing and accelerating the prevalent deep learning workloads on quantum computers is becoming possible. Along with this trend, quantum neural architectures based on different designs of quantum neurons have emerged. A fundamental question in quantum deep learning therefore arises: what is the best quantum neural architecture? Inspired by the design of neural architectures for classical computing, which typically employs multiple types of neurons, this paper makes the first attempt to mix quantum neuron designs to build quantum neural architectures. We observe that existing quantum neuron designs can be quite different yet complementary, such as neurons from variational quantum circuits (VQC) and from QuantumFlow. More specifically, VQC can apply real-valued weights but is difficult to extend to multiple layers, whereas QuantumFlow can build a multi-layer network efficiently but is limited to binary weights. To take advantage of both, we propose to mix them and devise a way to connect them seamlessly without additional costly measurements. We further investigate the design principles for mixing quantum neurons, which can provide guidance for future quantum neural architecture exploration. Experimental results demonstrate that the identified quantum neural architectures with mixed quantum neurons achieve 90.62% accuracy on the MNIST dataset, compared with 52.77% for VQC and 69.92% for QuantumFlow.
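To make the idea of cascading the two neuron types without an intermediate measurement concrete, the following is a minimal illustrative sketch, not the paper's actual construction. It assumes PennyLane with the "default.qubit" simulator, represents the VQC-style block with real-valued rotation angles, and (as an assumption for illustration) realizes binary +1/-1 weights as conditional phase flips applied directly to the VQC output state, so only one measurement occurs at the very end.

```python
# Conceptual sketch only (assumptions: PennyLane, "default.qubit",
# binary weights modeled as phase flips); not the authors' implementation.
import pennylane as qml
import numpy as np

n_qubits = 4
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def mixed_circuit(features, vqc_weights, binary_weights):
    # Encode classical inputs as rotation angles.
    qml.AngleEmbedding(features, wires=range(n_qubits))
    # VQC-style block: trainable, real-valued rotation angles.
    qml.StronglyEntanglingLayers(vqc_weights, wires=range(n_qubits))
    # Binary-weight block in the spirit of QuantumFlow: +1/-1 weights
    # realized here as conditional phase flips on the VQC output state,
    # so no measurement is needed between the two blocks.
    for wire, w in enumerate(binary_weights):
        if w < 0:
            qml.PauliZ(wires=wire)
    # Let the weighted amplitudes interfere before a single final readout.
    for wire in range(n_qubits - 1):
        qml.CNOT(wires=[wire, wire + 1])
    return qml.expval(qml.PauliZ(n_qubits - 1))

features = np.array([0.1, 0.5, 0.9, 1.3])
vqc_weights = np.random.uniform(0, np.pi, size=(2, n_qubits, 3))  # (layers, qubits, 3)
binary_weights = [1, -1, 1, -1]
print(mixed_circuit(features, vqc_weights, binary_weights))
```

The point of the sketch is only the seamless hand-off: the real-valued variational block and the binary-weight block act on the same quantum state back-to-back, with a single expectation-value readout at the end rather than a costly measurement between layers.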