Tensor networks, which originate from quantum physics, are emerging as an efficient tool for classical and quantum machine learning. Nevertheless, a considerable accuracy gap remains between tensor networks and sophisticated neural network models for classical machine learning. In this work, we combine the ideas of the matrix product state (MPS), the simplest tensor network structure, and residual neural networks, and propose the residual matrix product state (ResMPS). The ResMPS can be treated as a network whose layers map the "hidden" features to the outputs (e.g., classifications), and whose variational parameters are functions of the features of the samples (e.g., pixels of images). This differs from neural networks, whose layers map the features feed-forwardly to the output. The ResMPS can be equipped with non-linear activations and dropout layers, and it outperforms the state-of-the-art tensor network models in terms of efficiency, stability, and expressive power. Besides, the ResMPS is interpretable from the perspective of polynomial expansion, where factorization and exponential machines naturally emerge. Our work contributes to connecting and hybridizing neural and tensor networks, which is crucial to further enhancing our understanding of the working mechanisms and improving the performance of both models.
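The key structural idea above is that each layer updates a hidden state residually, with the sample's features parameterizing the update rather than serving as the layer input. A minimal NumPy sketch of such a forward pass is shown below; the residual update rule h ← h + x_n · (W_n h), and all names and shapes, are illustrative assumptions rather than the paper's exact parameterization.

```python
import numpy as np

def res_mps_forward(x, weights, h0, out_weight):
    """Illustrative residual-MPS-style forward pass.

    One residual layer per input feature x_n: the feature scales a
    parameterized linear update of the hidden vector h, so the sample's
    features enter through the layer parameters, not as layer inputs.
    """
    h = h0.copy()
    for x_n, W_n in zip(x, weights):
        h = h + x_n * (W_n @ h)  # residual update, gated by feature x_n
    return out_weight @ h        # map final hidden state to class scores

rng = np.random.default_rng(0)
n_features, hidden_dim, n_classes = 8, 4, 3
x = rng.normal(size=n_features)                      # e.g. flattened pixels
weights = rng.normal(scale=0.1, size=(n_features, hidden_dim, hidden_dim))
h0 = np.ones(hidden_dim)                             # initial hidden state
out_weight = rng.normal(size=(n_classes, hidden_dim))

scores = res_mps_forward(x, weights, h0, out_weight)
print(scores.shape)  # one score per class: (3,)
```

With the residual term removed, the product of the per-feature linear maps recovers a plain MPS-like contraction; expanding the residual products in powers of the features yields the polynomial-expansion view mentioned in the abstract.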