We consider the statistical inverse problem of recovering a parameter $\theta\in H^\alpha$ from data arising in the Gaussian regression model \begin{equation*} Y = \mathscr{G}(\theta)(Z)+\varepsilon \end{equation*} with nonlinear forward map $\mathscr{G}:\mathbb{L}^2\to\mathbb{L}^2$, random design points $Z$, and Gaussian noise $\varepsilon$. The estimation strategy is based on a least squares approach under $\Vert\cdot\Vert_{H^\alpha}$-constraints. Under Lipschitz-type assumptions on the forward map $\mathscr{G}$, we establish the existence of a least squares estimator $\hat{\theta}$ as a maximizer of an associated functional. We prove a general concentration result, which we then use to establish consistency and upper bounds for the prediction error. The resulting rates of convergence reflect not only the smoothness of the parameter of interest but also the ill-posedness of the underlying inverse problem. We apply the general model to the Darcy problem, which concerns the recovery of an unknown coefficient function $f$ in a PDE. For this example, we also derive rates of convergence for both the prediction and the estimation error. Finally, we briefly discuss the applicability of the general model to other problems.
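For orientation, a minimal sketch of the estimator's typical form, under assumptions not fixed by the abstract: with $n$ i.i.d. observations $(Y_i,Z_i)$ and an (assumed) constraint radius $R>0$, an $H^\alpha$-constrained least squares estimator can be written as
\begin{equation*}
\hat{\theta} \in \arg\min_{\theta:\,\Vert\theta\Vert_{H^\alpha}\le R} \ \frac{1}{n}\sum_{i=1}^{n}\bigl(Y_i-\mathscr{G}(\theta)(Z_i)\bigr)^2,
\end{equation*}
or, equivalently (expanding the square and dropping the term $\frac{1}{n}\sum_{i=1}^{n}Y_i^2$, which does not depend on $\theta$), a maximizer of $\theta\mapsto \frac{2}{n}\sum_{i=1}^{n} Y_i\,\mathscr{G}(\theta)(Z_i)-\frac{1}{n}\sum_{i=1}^{n}\mathscr{G}(\theta)(Z_i)^2$, consistent with the description of $\hat{\theta}$ as a maximizer of a functional; the paper's exact criterion may differ.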
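Likewise, a standard formulation of the Darcy problem referenced above (an assumed illustrative form; the domain, boundary condition, and source term are not specified in the abstract): on a bounded domain $\mathcal{O}$ with source $g$, the coefficient $f$ determines the solution $u_f$ of
\begin{equation*}
\nabla\cdot\bigl(f\,\nabla u_f\bigr) = g \ \text{ on } \mathcal{O}, \qquad u_f = 0 \ \text{ on } \partial\mathcal{O},
\end{equation*}
so that the forward map sends the parameter (here $f$, or a parametrization of it) to $u_f$, and regression on noisy point evaluations of $u_f$ yields the data for recovering $f$.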