Backward stochastic differential equations (BSDEs) are nowadays among the most frequently studied equations in stochastic analysis and computational stochastics. BSDEs arising in applications are often nonlinear and high-dimensional. In nearly all cases such nonlinear high-dimensional BSDEs cannot be solved explicitly, and it has been, and still is, a very active topic of research to design and analyze numerical approximation methods for approximately solving nonlinear high-dimensional BSDEs. Although there is a large number of research articles in the scientific literature which analyze numerical approximation methods for nonlinear BSDEs, until today there has been no numerical approximation method which has been proven to overcome the curse of dimensionality in the numerical approximation of nonlinear BSDEs in the sense that the number of computational operations of the numerical approximation method needed to approximately compute one sample path of the BSDE solution grows at most polynomially in both the reciprocal $1/\varepsilon$ of the prescribed approximation accuracy $\varepsilon \in (0,\infty)$ and the dimension $d \in \mathbb{N} = \{1,2,3,\ldots\}$ of the BSDE. It is the key contribution of this article to overcome this obstacle by introducing a new Monte Carlo-type numerical approximation method for high-dimensional BSDEs and by proving that this Monte Carlo-type method does indeed overcome the curse of dimensionality in the approximate computation of solution paths of BSDEs.
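To make the complexity criterion above precise, it can be formalized as follows (a sketch; the cost functional $\operatorname{Cost}$ and the constants $c, \kappa \in (0,\infty)$ are introduced here only for illustration, and the accuracy is restricted to the relevant regime of small values $\varepsilon \in (0,1]$): writing $\operatorname{Cost}(d,\varepsilon)$ for the number of computational operations which the method uses to approximately compute one sample path of the solution of the $d$-dimensional BSDE with accuracy $\varepsilon$, overcoming the curse of dimensionality means that there exist $c, \kappa \in (0,\infty)$ such that
\[
  \operatorname{Cost}(d,\varepsilon) \,\le\, \kappa\, d^{c}\, \varepsilon^{-c}
  \qquad
  \text{for all } d \in \mathbb{N} \text{ and all } \varepsilon \in (0,1].
\]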