Relying on the classical connection between backward stochastic differential equations (BSDEs) and non-linear parabolic partial differential equations (PDEs), we propose a new probabilistic learning scheme for solving high-dimensional semi-linear parabolic PDEs. This scheme is inspired by the machine-learning approach developed with deep neural networks in Han et al. [32]. Our algorithm is based on a Picard iteration scheme in which a sequence of linear-quadratic optimisation problems is solved by means of a stochastic gradient descent (SGD) algorithm. In the framework of a linear specification of the approximation space, we manage to prove a convergence result for our scheme, under some smallness condition. In practice, in order to be able to treat high-dimensional examples, we employ sparse grid approximation spaces. In the case of periodic coefficients and using pre-wavelet basis functions, we obtain an upper bound on the global complexity of our method. It shows in particular that the curse of dimensionality is tamed in the sense that, in order to achieve a root mean squared error of order $\epsilon$, for a prescribed precision $\epsilon$, the complexity of the Picard algorithm grows polynomially in $\epsilon^{-1}$ up to some logarithmic factor $|\log(\epsilon)|$ which grows linearly with respect to the PDE dimension. Various numerical results are presented to validate the performance of our method and to compare it with some recent machine learning schemes proposed in Han et al. [20] and Huré et al. [37].
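The core numerical idea of the abstract — an outer Picard (fixed-point) iteration whose inner step is a linear-quadratic regression problem over a linear approximation space, solved by SGD — can be illustrated on a toy problem. The sketch below is purely schematic and uses made-up names (`phi`, `g`, the contraction `F(u) = 0.5*u + g`); it is not the paper's algorithm, only a minimal instance of the Picard-plus-SGD structure it describes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy fixed-point problem: find u with u = F(u), where F(u)(x) = 0.5*u(x) + g(x).
# F is a 1/2-contraction, so the fixed point is u*(x) = 2*g(x).
# We search for u in a linear approximation space spanned by phi(x).

def phi(x):
    # Linear approximation space: polynomial features 1, x, x^2 (illustrative choice)
    return np.stack([np.ones_like(x), x, x**2], axis=1)

def g(x):
    return 1.0 + 0.5 * x**2  # chosen so the fixed point 2*g lies in the span: [2, 0, 1]

theta = np.zeros(3)              # coefficients of the current Picard iterate u_k
lr, n_picard, n_sgd = 0.1, 20, 500

for k in range(n_picard):
    theta_k = theta.copy()       # freeze u_k: the SGD target for this Picard step
    for _ in range(n_sgd):
        # Inner linear-quadratic subproblem: min_theta E|phi(x)@theta - F(u_k)(x)|^2,
        # attacked with minibatch SGD as in the scheme's inner loop.
        x = rng.uniform(-1.0, 1.0, size=32)
        features = phi(x)
        y = 0.5 * features @ theta_k + g(x)          # F(u_k) on the minibatch
        grad = features.T @ (features @ theta - y) / len(x)
        theta -= lr * grad

print(np.round(theta, 2))        # should be close to the fixed point [2, 0, 1]
```

Each outer iteration contracts toward the fixed point, while the inner SGD loop only needs to solve a quadratic regression problem; this is the structural split the abstract refers to.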