In this work, we investigate the universal representation capacity of Matrix Product States (MPS) from the perspective of Boolean functions and continuous functions. We show that MPS can exactly realize arbitrary Boolean functions by providing an explicit construction of the corresponding MPS for an arbitrarily given Boolean gate. Moreover, we prove that the function space of MPS with the scale-invariant sigmoidal activation is dense in the space of continuous functions defined on a compact subspace of the $n$-dimensional real coordinate space $\mathbb{R}^{n}$. We study the relation between MPS and neural networks and show that an MPS with a scale-invariant sigmoidal function is equivalent to a one-hidden-layer neural network equipped with a kernel function. We construct the equivalent neural networks for several specific MPS models and show that non-linear kernels, such as the polynomial kernel, which introduce couplings between different components of the input, appear naturally in the equivalent neural networks. Finally, we discuss the realization of Gaussian processes (GP) with infinitely wide MPS by studying their equivalent neural networks.
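For concreteness, the following is a minimal sketch of how an MPS-parameterized scalar function of a real input is typically evaluated in this setting; the local feature map `phi`, the bond dimension, and the trace boundary convention used here are illustrative assumptions and are not fixed by the abstract.

```python
import numpy as np

def mps_function(x, tensors, phi):
    """Evaluate an MPS-parameterized scalar function of an input vector x.

    tensors : list of arrays A_i of shape (d, D, D), one per input component
    phi     : local feature map, phi(x_i) -> length-d vector (assumed form)
    """
    D = tensors[0].shape[1]
    result = np.eye(D)
    # Contract the chain: for each site, form sum_s phi_s(x_i) * A_i^{s},
    # then multiply the resulting D x D matrices along the chain.
    for x_i, A in zip(x, tensors):
        result = result @ np.einsum('s,sab->ab', phi(x_i), A)
    return np.trace(result)

# Toy usage with an assumed two-component feature map phi(t) = (1, t)
phi = lambda t: np.array([1.0, t])
rng = np.random.default_rng(0)
tensors = [rng.normal(size=(2, 3, 3)) for _ in range(4)]
print(mps_function(np.array([0.1, -0.5, 0.3, 0.7]), tensors, phi))
```

In the neural-network correspondence discussed above, a scale-invariant sigmoidal activation would then be applied to this scalar output.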