Covariate shift in regression problems, i.e., the mismatch between the training and test covariate distributions, is a commonly encountered phenomenon in machine learning. In this paper, we extend recent results on nonparametric convergence rates for i.i.d. data to Markovian dependence structures. We demonstrate that, under H\"older smoothness assumptions on the regression function, convergence rates for the generalization risk of a Nadaraya-Watson kernel estimator are determined by the similarity between the invariant distributions associated with the source and target Markov chains. This similarity is explicitly captured by a bandwidth-dependent similarity measure recently introduced by Pathak, Ma and Wainwright [ICML, 2022]. Precise convergence rates are derived for the particular cases of finite Markov chains and of spectral gap Markov chains for which the similarity measure between their invariant distributions grows polynomially with decreasing bandwidth. For the latter, we extend the notion of a distribution transfer exponent from Kpotufe and Martinet [Ann. Stat., 49(6), 2021] to kernel transfer exponents of uniformly ergodic Markov chains, thereby generating a rich class of Markov kernel pairs for which convergence guarantees for the covariate shift problem can be formulated.
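For concreteness, the two central objects can be sketched as follows. The Nadaraya-Watson form below is the standard one (generic kernel $K$, bandwidth $h>0$, and sample $(X_1,Y_1),\dots,(X_n,Y_n)$ drawn along the source chain); the displayed form of the similarity measure follows the ball-mass formulation of Pathak, Ma and Wainwright [ICML, 2022] up to normalization conventions, with $p$ and $q$ standing in for the source and target invariant distributions and $B(x,h)$ the ball of radius $h$ around $x$. The notation is generic, not necessarily the paper's own.

% Sketch in generic notation; symbols $K$, $h$, $p$, $q$, $B(x,h)$ are
% illustrative conventions rather than the paper's exact definitions.
\[
  \hat{f}_{n,h}(x)
  \;=\;
  \frac{\sum_{i=1}^{n} K\!\left(\frac{x - X_i}{h}\right) Y_i}
       {\sum_{i=1}^{n} K\!\left(\frac{x - X_i}{h}\right)},
  \qquad
  \rho_h(p, q)
  \;=\;
  \int \frac{q(\mathrm{d}x)}{p\bigl(B(x,h)\bigr)},
\]
% with the usual convention $\hat{f}_{n,h}(x) := 0$ when the denominator
% vanishes. The measure $\rho_h$ grows as $h \downarrow 0$ whenever the target
% places mass where the source is thin; polynomial growth in $1/h$ is the
% regime quantified by the kernel transfer exponents discussed above.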