We investigate non-adaptive methods of deep ReLU neural network approximation of the solution $u$ to parametric and stochastic elliptic PDEs with lognormal inputs on the non-compact set $\mathbb{R}^\infty$. The approximation error is measured in the norm of the Bochner space $L_2(\mathbb{R}^\infty, V, \gamma)$, where $\gamma$ is the tensor-product standard Gaussian probability measure on $\mathbb{R}^\infty$ and $V$ is the energy space. The approximation is based on an $m$-term truncation of the Hermite generalized polynomial chaos (gpc) expansion of $u$. Under a certain $\ell_q$-summability assumption on the lognormal inputs ($0 < q < \infty$), we prove that for every integer $n > 1$ one can construct a non-adaptive compactly supported deep ReLU neural network $\boldsymbol{\phi}_n$ on $\mathbb{R}^m$, of size not greater than $n$ and with $m = \mathcal{O}(n/\log n)$ outputs, so that the sum obtained by replacing the polynomials in the $m$-term truncation of the Hermite gpc expansion with these $m$ outputs approximates $u$ with error bound $\mathcal{O}\left((n/\log n)^{-1/q}\right)$. This bound is comparable to the error bound $\mathcal{O}(n^{-1/q})$ of the best approximation of $u$ by $n$-term truncations of the Hermite gpc expansion. We also obtain analogous results for parametric and stochastic elliptic PDEs with affine inputs, based on Jacobi and Taylor gpc expansions.
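The $m$-term Hermite gpc truncation underlying the construction can be illustrated in one dimension. The sketch below is a hypothetical toy example (the target $f(y) = e^y$ and all numerical choices are our own, not the PDE solution $u$): it computes Hermite coefficients by Gauss-Hermite quadrature with respect to the standard Gaussian measure $\gamma$ and shows the $L_2(\gamma)$ truncation error shrinking as $m$ grows.

```python
import math
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval

# Probabilists' Gauss-Hermite nodes/weights; hermegauss integrates against
# exp(-y^2/2), so divide by sqrt(2*pi) to normalize to the Gaussian measure.
nodes, weights = hermegauss(60)
weights = weights / np.sqrt(2.0 * np.pi)

f = np.exp  # toy target, analytic in y (stand-in for a parametric quantity)

def truncated_hermite_coeffs(m):
    """Coefficients c_k = E[f(Y) He_k(Y)] / k! of the m-term truncation.

    He_k are probabilists' Hermite polynomials, orthogonal in L_2(gamma)
    with squared norm k!.
    """
    coeffs = []
    for k in range(m):
        basis = hermeval(nodes, [0.0] * k + [1.0])  # He_k at the nodes
        ck = np.sum(weights * f(nodes) * basis) / math.factorial(k)
        coeffs.append(ck)
    return np.array(coeffs)

def l2_gamma_error(m):
    """Quadrature estimate of the L_2(gamma) error of the m-term truncation."""
    approx = hermeval(nodes, truncated_hermite_coeffs(m))
    return np.sqrt(np.sum(weights * (f(nodes) - approx) ** 2))

errors = [l2_gamma_error(m) for m in (2, 4, 8)]
assert errors[0] > errors[1] > errors[2]  # error decays as m grows
```

In the paper's setting each Hermite polynomial in such a truncation is itself replaced by a ReLU network output; this sketch only shows the truncation step that the networks emulate.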