We construct a least squares approximation method for the recovery of complex-valued functions from a reproducing kernel Hilbert space on $D \subset \mathbb{R}^d$. The nodes are drawn at random for the whole class of functions, and the error is measured in $L_2(D,\varrho_D)$. We prove worst-case recovery guarantees by explicitly controlling all the constants involved. This leads to new preasymptotic recovery bounds with high probability for the error of Hyperbolic Fourier Regression on multivariate data. In addition, we investigate its counterpart, Hyperbolic Wavelet Regression, which is also based on least squares and recovers non-periodic functions from random samples. Finally, we reconsider the analysis of a cubature method based on plain random points with optimal weights and reveal near-optimal worst-case error bounds with high probability. It turns out that this simple method can compete with the quasi-Monte Carlo methods in the literature, which are based on lattices and digital nets.
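For orientation, a least squares estimator of the type discussed above may be sketched as follows; the notation $V_m$ for a finite-dimensional ansatz space and $x^1,\dots,x^n \in D$ for the random nodes is introduced here only for illustration and is not fixed by the abstract:
\begin{equation*}
	\hat f \;:=\; \operatorname*{arg\,min}_{g \in V_m} \; \sum_{i=1}^{n} \big| g(x^i) - f(x^i) \big|^2,
	\qquad \text{error measured as } \|f - \hat f\|_{L_2(D,\varrho_D)} .
\end{equation*}
In the case of Hyperbolic Fourier Regression one would, for instance, take $V_m$ to be spanned by trigonometric monomials with frequencies from a hyperbolic cross.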