Optimal recovery is a mathematical framework for learning functions from observational data by adopting a worst-case perspective tied to model assumptions on the functions to be learned. Working in a finite-dimensional Hilbert space, we consider model assumptions based on approximability, with observation inaccuracies modeled as additive errors bounded in $\ell_2$. We focus on the local recovery problem, which amounts to the determination of Chebyshev centers. Earlier work by Beck and Eldar presented a semidefinite recipe for the determination of Chebyshev centers. This recipe was guaranteed to be valid in the complex setting but not necessarily in the real setting, since it relied on the S-procedure with two quadratic constraints, which offers a tight relaxation only in the complex setting. Our contribution consists in proving that the semidefinite recipe is exact in the real setting, too, at least in the particular instance where the quadratic constraints involve orthogonal projectors. Our argument exploits earlier work of ours, in which exact Chebyshev centers were obtained in a different way. We conclude by stating some open questions and by commenting on other recent results in optimal recovery.
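For orientation, the local recovery problem mentioned above can be sketched as follows. This is the standard formulation in the optimal recovery framework; the symbols $H$, $V$, $P_V$, $\Lambda$, $y$, $\varepsilon$, $\eta$ are introduced here for illustration and are not fixed by the abstract itself:
\[
K_y = \bigl\{ f \in H \,:\, \|f - P_V f\| \le \varepsilon, \ \ \|\Lambda f - y\|_2 \le \eta \bigr\},
\qquad
z^\star \in \operatorname*{argmin}_{z \in H} \, \sup_{f \in K_y} \|f - z\|.
\]
Here $H$ is the finite-dimensional Hilbert space, $P_V$ is the orthogonal projector onto an approximation space $V$ encoding the approximability assumption, $\Lambda$ is the observation map producing the inaccurate data $y$, and $z^\star$ is the Chebyshev center of the set $K_y$ of model- and data-consistent functions, i.e., the element minimizing the worst-case recovery error over $K_y$.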