This paper examines the asymptotic convergence properties of Lipschitz interpolation methods in the context of bounded stochastic noise. In the first part of the paper, we establish probabilistic consistency guarantees for the classical approach in a general setting and derive upper bounds on the uniform convergence rates. These bounds align with the well-established optimal rates of non-parametric regression obtained in related settings and provide new, precise upper bounds for the non-parametric regression problem under bounded-noise assumptions. Practically, they can serve as a theoretical tool for comparing Lipschitz interpolation to alternative non-parametric regression methods: they yield a condition on the behaviour of the noise at the boundary of its support that indicates when Lipschitz interpolation should be expected to asymptotically outperform or underperform other approaches. In the second part, we extend these results to asymptotic guarantees for online learning of dynamics in discrete-time stochastic systems and illustrate their utility by deriving closed-loop stability guarantees for a simple controller. We also explore applications in which the assumption of prior knowledge of the Lipschitz constant is removed by adopting the LACKI framework (Calliess et al., 2020), for which we derive general asymptotic consistency guarantees.
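For readers unfamiliar with the classical estimator analysed in the first part, it admits a compact closed form: the prediction at a query point is the midpoint of the tightest upper and lower envelopes consistent with the known Lipschitz constant. The sketch below is a minimal illustration of that standard form, not code from the paper; the function name, the toy data, and the choice of `L` are assumptions made for the example.

```python
import numpy as np

def lipschitz_interpolation(x_query, X, y, L):
    """Classical Lipschitz interpolation predictor.

    Given samples (X, y) of an L-Lipschitz target observed under bounded
    noise, returns the midpoint of the tightest ceiling and floor
    functions consistent with the Lipschitz constant L.
    """
    # Distances from the query point to every sample input.
    d = np.linalg.norm(X - x_query, axis=1)
    upper = np.min(y + L * d)  # tightest Lipschitz-consistent ceiling
    lower = np.max(y - L * d)  # tightest Lipschitz-consistent floor
    return 0.5 * (upper + lower)

# Illustrative usage: noisy samples of a Lipschitz function on [-1, 1],
# with noise drawn from a bounded (uniform) distribution.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(50, 1))
y = np.sin(3.0 * X[:, 0]) + rng.uniform(-0.1, 0.1, size=50)
print(lipschitz_interpolation(np.array([0.2]), X, y, L=3.0))
```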