In this work, we study the Neural Tangent Kernel (NTK) of Matrix Product States (MPS) and its convergence in the infinite-bond-dimension limit. We prove that the NTK of MPS asymptotically converges to a constant matrix during the gradient descent (training) process, as well as at initialization, as the bond dimensions of the MPS go to infinity; the key observation is that the variation of the tensors in the MPS asymptotically vanishes during training in this limit. By showing the positive-definiteness of the NTK of MPS, the convergence of MPS during training in function space (the space of functions represented by MPS) is guaranteed without any extra assumptions on the data set. We then consider the settings of (supervised) Regression with Mean Square Error (RMSE) and (unsupervised) Born Machines (BM) and analyze their dynamics in the infinite-bond-dimension limit. The ordinary differential equations (ODEs) that describe the dynamics of the responses of the MPS in RMSE and BM are derived and solved in closed form. For the regression setting, we consider Mercer kernels (Gaussian kernels) and find that the evolution of the mean of the responses of the MPS follows the largest eigenvalue of the NTK. Due to the orthogonality of the kernel functions in BM, the evolution of different modes (samples) decouples and the "characteristic time" of convergence in training is obtained.
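As a brief illustration of the dynamics summarized above, consider the standard NTK picture for the RMSE setting (written in generic notation $f_\theta$, $\Theta_t$, which may differ from the symbols used in the body of the paper). The empirical NTK is
\[
\Theta_t(x, x') \;=\; \nabla_\theta f_{\theta_t}(x) \cdot \nabla_\theta f_{\theta_t}(x'),
\]
and gradient flow on the mean square error $\tfrac{1}{2}\sum_i \bigl(f_{\theta_t}(x_i) - y_i\bigr)^2$ yields
\[
\frac{d}{dt} f_{\theta_t}(x_i) \;=\; -\sum_j \Theta_t(x_i, x_j)\,\bigl(f_{\theta_t}(x_j) - y_j\bigr).
\]
When $\Theta_t$ converges to a constant, positive-definite matrix $\Theta^\ast$ in the infinite-bond-dimension limit, this linear ODE admits the closed-form solution $f_t = y + e^{-\Theta^\ast t}(f_0 - y)$, so each eigenmode of $\Theta^\ast$ relaxes at a rate set by the corresponding eigenvalue; the "characteristic times" of convergence referred to above are of this type.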