We consider $L^2$-approximation on weighted reproducing kernel Hilbert spaces of functions depending on infinitely many variables. We focus on unrestricted linear information, admitting evaluations of arbitrary continuous linear functionals. We distinguish between ANOVA and non-ANOVA spaces, where by ANOVA spaces we mean function spaces whose norms are induced by an underlying ANOVA decomposition of functions. In ANOVA spaces, we prove that there is an optimal algorithm solving the approximation problem with linear information. In this way, we determine the exact polynomial convergence rate of the $n$-th minimal worst-case errors. For non-ANOVA spaces, we also establish upper and lower error bounds. Although these bounds do not match, they reveal that for weights with a moderate decay behavior, the convergence rate of the $n$-th minimal errors is strictly higher in ANOVA than in non-ANOVA spaces.
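For orientation, the central quantity can be written in the standard form used in information-based complexity; this is a hedged sketch, and the symbols $H$, $e(n)$, and $\lambda^{*}$ are notational assumptions rather than definitions taken from the text above.
% Sketch: n-th minimal worst-case error for L^2-approximation with
% unrestricted linear information (standard IBC formulation; notation assumed).
\[
  e(n) \;=\; \inf_{A_n} \,\sup_{\substack{f \in H \\ \|f\|_H \le 1}}
  \bigl\| f - A_n(f) \bigr\|_{L^2},
\]
% where the infimum ranges over all algorithms A_n that use at most n
% evaluations of continuous linear functionals. The polynomial convergence
% rate referred to above is then
\[
  \lambda^{*} \;=\; \sup\bigl\{ \lambda \ge 0 \,:\, e(n) = \mathcal{O}(n^{-\lambda}) \bigr\}.
\]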