High-dimensional partial differential equations (PDEs) are ubiquitous in economics, science and engineering. However, their numerical treatment poses formidable challenges since traditional grid-based methods tend to be frustrated by the curse of dimensionality. In this paper, we argue that tensor trains provide an appealing approximation framework for parabolic PDEs: the combination of reformulations in terms of backward stochastic differential equations and regression-type methods in the tensor format holds the promise of leveraging latent low-rank structures enabling both compression and efficient computation. Following this paradigm, we develop novel iterative schemes, involving either explicit and fast or implicit and accurate updates. We demonstrate in a number of examples that our methods achieve a favorable trade-off between accuracy and computational efficiency in comparison with state-of-the-art neural network based approaches.
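To fix ideas, the tensor-train (TT) format mentioned above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the monomial basis, the rank choices, and the helper names (`phi`, `tt_eval`) are assumptions made for the example.

```python
import numpy as np

# A function u(x_1,...,x_d) approximated in tensor-train format,
#   u(x) = G_1[x_1] G_2[x_2] ... G_d[x_d],
# where each core G_k is an (r_{k-1}, n, r_k) array contracted with a
# vector of one-dimensional basis functions evaluated at x_k.

def phi(x, n):
    """Monomial basis (1, x, x^2, ...) -- an assumed choice."""
    return np.array([x**j for j in range(n)])

def tt_eval(cores, x):
    """Evaluate a TT functional approximation at a point x in R^d."""
    v = np.ones((1,))  # boundary rank r_0 = 1
    for G, xk in zip(cores, x):
        # contract the basis over the mode index, then chain the ranks
        v = v @ np.einsum('anb,n->ab', G, phi(xk, G.shape[1]))
    return v.item()    # boundary rank r_d = 1

# Random cores for d = 4 dimensions, n = 3 basis functions, TT-ranks
# (1, 2, 2, 2, 1): storage scales as O(d * n * r^2) instead of O(n^d),
# which is the compression the abstract refers to.
rng = np.random.default_rng(0)
ranks = [1, 2, 2, 2, 1]
cores = [rng.standard_normal((ranks[k], 3, ranks[k + 1])) for k in range(4)]
print(tt_eval(cores, [0.1, 0.2, 0.3, 0.4]))
```

With all-ones cores of shape (1, 2, 1) and the basis (1, x), the evaluation reduces to the product of the factors (1 + x_k), which gives a quick sanity check of the contraction.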