In this paper, we propose a PAC-Bayesian \textit{a posteriori} parameter selection scheme for adaptive regularized regression in Hilbert scales under general, unknown source conditions. We demonstrate that our approach adapts to misspecification and achieves the optimal learning rate under subgaussian noise. Unlike existing parameter selection schemes, the computational complexity of our approach is independent of the sample size. We derive minimax adaptive rates for a new, broad class of Tikhonov-regularized learning problems under general, misspecified source conditions that, notably, do not require any conventional a priori assumptions on the kernel eigendecay. Using interpolation theory, we show that the spectrum of the Mercer operator can be inferred in the presence of ``tight'' $L^{\infty}$ embeddings of suitable Hilbert scales. Finally, we prove that, under a $\Delta_2$ condition on the smoothness index functions, our PAC-Bayesian scheme indeed achieves minimax rates. We discuss applications of our approach to statistical inverse problems and oracle-efficient contextual bandit algorithms.
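For reference, the $\Delta_2$ condition invoked above is the standard doubling condition on index functions; a minimal statement in generic notation (the symbols $\varphi$, $b$, and $C$ are illustrative and not necessarily the paper's):

\[
\varphi(2t) \;\le\; C\,\varphi(t) \qquad \text{for all } t \in (0, b/2],
\]

for some constant $C \ge 1$, where $\varphi \colon [0,b] \to [0,\infty)$ is a continuous, nondecreasing index function with $\varphi(0)=0$. Hölder-type index functions $\varphi(t) = t^{\nu}$, $\nu > 0$, satisfy this with $C = 2^{\nu}$.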