Partial Differential Equations (PDEs) are used to model a wide variety of dynamical systems in science and engineering. Recent advances in deep learning have made it possible to solve them in higher dimensions by addressing the curse of dimensionality in new ways. However, deep learning methods are constrained by training time and memory. To address these shortcomings, we implement Tensor Neural Networks (TNN), a quantum-inspired neural network architecture that leverages Tensor Network ideas to improve upon deep learning approaches. We demonstrate that TNN provide significant parameter savings while attaining the same accuracy as classical Dense Neural Networks (DNN). We also show that, for the same accuracy, TNN can be trained faster than DNN. We benchmark TNN on parabolic PDEs, specifically the Black-Scholes-Barenblatt equation widely used in financial pricing theory, empirically demonstrating the advantages of TNN over DNN. Further examples, such as the Hamilton-Jacobi-Bellman equation, are also discussed.
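The parameter savings claimed above come from the tensor-network structure of the weights. As a minimal sketch (not the paper's actual architecture), the snippet below compares the parameter count of a dense 256x256 weight matrix against a tensor-train (matrix product) factorization of the same map; the mode sizes and the TT-rank `r` are illustrative choices, and the cores are contracted back into a full matrix to confirm the factorization represents a 256x256 linear map.

```python
import numpy as np

# A dense layer mapping 256 inputs to 256 outputs stores a full weight matrix.
dense_params = 256 * 256  # 65,536 weights

# A tensor-train layer factors that matrix: reshape 256 = 4*4*4*4 on both the
# input and output sides and connect four small cores by a bond of rank r.
in_modes = out_modes = [4, 4, 4, 4]
r = 8                      # hypothetical TT-rank: the accuracy/size trade-off knob
ranks = [1, r, r, r, 1]    # boundary bond dimensions are 1

# Core k has shape (rank_left, in_mode, out_mode, rank_right).
tt_params = sum(ranks[k] * in_modes[k] * out_modes[k] * ranks[k + 1]
                for k in range(4))
print(dense_params, tt_params)  # 65536 vs 2304 parameters

# Contract random cores back into a full matrix to check the map's shape.
cores = [np.random.randn(ranks[k], in_modes[k], out_modes[k], ranks[k + 1])
         for k in range(4)]
W = cores[0]
for core in cores[1:]:
    # merge along the shared bond index: (..., r) x (r, i, o, r') -> (..., i, o, r')
    W = np.tensordot(W, core, axes=([-1], [0]))
W = W.squeeze(0).squeeze(-1)                  # drop the trivial boundary bonds
W = W.transpose(0, 2, 4, 6, 1, 3, 5, 7)       # group input modes, then output modes
W = W.reshape(256, 256)
print(W.shape)  # (256, 256)
```

With these illustrative choices the factorized layer uses 2,304 parameters instead of 65,536, roughly a 28x reduction; increasing the bond rank `r` recovers expressiveness at the cost of parameters.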