We consider best approximation problems in a nonlinear subset $\mathcal{M}$ of a Banach space of functions $(\mathcal{V},\|\bullet\|)$. The norm is assumed to be a generalization of the $L^2$-norm for which only a weighted Monte Carlo estimate $\|\bullet\|_n$ can be computed. The objective is to obtain an approximation $v\in\mathcal{M}$ of an unknown function $u \in \mathcal{V}$ by minimizing the empirical norm $\|u-v\|_n$. We consider this problem for general nonlinear subsets and establish error bounds for the empirical best approximation error. Our results are based on a restricted isometry property (RIP) which holds in probability and is independent of the nonlinear least squares setting. Several model classes are examined where analytical statements can be made about the RIP and the results are compared to existing sample complexity bounds from the literature. We find that for well-studied model classes our general bound is weaker but exhibits many of the same properties as these specialized bounds. Notably, we demonstrate the advantage of an optimal sampling density (as known for linear spaces) for sets of functions with sparse representations.