We propose a deep neural-operator framework for a general class of probabilistic models. Under global Lipschitz conditions on the operator over the whole Euclidean space, and for a broad class of probabilistic models, we establish a universal approximation theorem with explicit network-size bounds for the proposed architecture. The underlying stochastic processes are required only to satisfy integrability and general tail-probability conditions. We verify these assumptions for both European and American option-pricing problems within the forward-backward SDE (FBSDE) framework, which in turn covers a broad class of operators arising from parabolic PDEs, with or without free boundaries. Finally, we present a numerical example for a basket of American options, demonstrating that the learned model produces optimal stopping boundaries for new strike prices without retraining.