We study the recovery of functions in the uniform norm based on function evaluations. We obtain worst case error bounds for general classes of functions in terms of the best $L_2$-approximation from a given nested sequence of subspaces combined with bounds on the Christoffel function of these subspaces. Besides an explicit bound, we obtain that linear algorithms using $n$ samples are optimal up to a factor $\sqrt{n}$ among all algorithms using arbitrary linear information. Moreover, our results imply that linear sampling algorithms are optimal up to a constant factor for many reproducing kernel Hilbert spaces. We also discuss results for approximation in more general seminorms, including $L_p$-approximation.
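As background, here is the standard definition of the Christoffel function appearing above; this is a sketch of common usage, not a statement taken from the paper, and the symbols $V_n$, $b_k$, and $\varrho$ are our own notation. For an $n$-dimensional subspace $V_n$ of $L_2(\varrho)$ on which point evaluation is well defined, with orthonormal basis $b_1,\dots,b_n$, the Christoffel function at a point $x$ is
\[
  \lambda(V_n,x) \;=\; \inf_{\substack{f \in V_n \\ f(x)=1}} \|f\|_{L_2(\varrho)}^2
  \;=\; \Big( \sum_{k=1}^{n} |b_k(x)|^2 \Big)^{-1},
\]
so pointwise lower bounds on $\lambda(V_n,\cdot)$ control how large elements of $V_n$ can be in the uniform norm relative to their $L_2$ norm.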