Due to the linearity of quantum mechanics, it remains a challenge to design quantum generative machine learning models that embed non-linear activations into the evolution of the statevector. However, some of the most successful classical generative models, such as those based on neural networks, rely on highly non-linear dynamics to train well. In this paper, we explore the effect of such dynamics in quantum generative modeling by introducing a model that adds non-linear activations, via a neural-network structure, to the standard Born Machine framework: the Quantum Neuron Born Machine (QNBM). To achieve this, we utilize a previously introduced Quantum Neuron subroutine, a repeat-until-success circuit with mid-circuit measurements and classical control. After introducing the QNBM, we investigate how its performance depends on network size by training a 3-layer QNBM with 4 output neurons and various input and hidden layer sizes. We then compare our non-linear QNBM to the linear Quantum Circuit Born Machine (QCBM). We allocate similar time and memory resources to each model, so that the only major difference is the qubit overhead required by the QNBM. With gradient-based training, we show that while both models can easily learn a trivial uniform probability distribution, on a more challenging class of distributions the QNBM achieves an error rate almost 3x smaller than that of a QCBM with a similar number of tunable parameters. We therefore provide evidence that non-linearity is a useful resource in quantum generative models, and we put forth the QNBM as a new model with strong generative performance and potential for quantum advantage.
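For concreteness, here is a minimal sketch of where the non-linearity enters, assuming the standard repeat-until-success Quantum Neuron construction from the prior literature (the exact parameterization used in the QNBM may differ). Each neuron accumulates a weighted pre-activation and, on a successful mid-circuit measurement outcome, rotates its output qubit through a sigmoid-like function of that pre-activation:

$$
\theta = b + \sum_i w_i x_i, \qquad q(\theta) = \arctan\!\big(\tan^2\theta\big),
$$

so that a success round applies $R_y\!\big(2\,q(\theta)\big)$ to the output qubit, while a failure outcome triggers a classically controlled correction and the subroutine is repeated. Iterating the subroutine composes $q$ with itself, sharpening it toward a step-like threshold function; this is the non-linear activation that the network structure stacks into layers.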