We study the problem of estimating the derivatives of a regression function, which has a wide range of applications as a key nonparametric functional of unknown functions. Standard analyses are often tailored to specific derivative orders, and parameter tuning remains a daunting challenge, particularly for high-order derivatives. In this article, we propose a simple plug-in kernel ridge regression (KRR) estimator in nonparametric regression with random design that is broadly applicable to multi-dimensional support and arbitrary mixed-partial derivatives. We provide a non-asymptotic analysis that studies the behavior of the proposed estimator in a unified manner encompassing the regression function and its derivatives, leading to two error bounds for a general class of kernels under the strong $L_\infty$ norm. In a concrete example specialized to kernels with polynomially decaying eigenvalues, the proposed estimator recovers the minimax optimal rate up to a logarithmic factor for estimating derivatives of functions in H\"older and Sobolev classes. Interestingly, the proposed estimator achieves the optimal rate of convergence with the same choice of tuning parameter for any order of derivative. Hence, the proposed estimator enjoys a \textit{plug-in property} for derivatives in that it automatically adapts to the order of derivatives to be estimated, enabling easy tuning in practice. Our simulation studies show favorable finite-sample performance of the proposed method relative to several existing methods and corroborate the theoretical findings on its minimax optimality.
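To illustrate the plug-in construction, the following minimal sketch fits KRR to synthetic one-dimensional data and differentiates the fitted kernel expansion to estimate the first derivative. It assumes a Gaussian kernel, and all function names and tuning values (\texttt{lam}, \texttt{ell}) are illustrative choices rather than the paper's implementation; note that the same regularization parameter is reused for the derivative, mirroring the plug-in property described above.

\begin{verbatim}
import numpy as np

def rbf_kernel(x, z, ell=0.3):
    # Gaussian (RBF) kernel k(x, z) = exp(-(x - z)^2 / (2 ell^2)).
    return np.exp(-(x - z) ** 2 / (2 * ell ** 2))

def rbf_kernel_dx(x, z, ell=0.3):
    # Partial derivative of the RBF kernel in its first argument x.
    return -(x - z) / ell ** 2 * rbf_kernel(x, z, ell)

def plug_in_krr_derivative(x_train, y_train, x_eval, lam=1e-3, ell=0.3):
    # Plug-in KRR estimate of f': differentiate the fitted expansion.
    #   f_hat(x)  = k(x)^T (K + n*lam*I)^{-1} y
    #   f_hat'(x) = (d/dx) k(x)^T (K + n*lam*I)^{-1} y
    n = len(x_train)
    K = rbf_kernel(x_train[:, None], x_train[None, :], ell)    # n x n Gram
    alpha = np.linalg.solve(K + n * lam * np.eye(n), y_train)  # ridge coefs
    K_dx = rbf_kernel_dx(x_eval[:, None], x_train[None, :], ell)
    return K_dx @ alpha

# Example: estimate f' for f(x) = sin(2*pi*x) from noisy observations.
rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 200)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(200)
grid = np.linspace(0.1, 0.9, 5)
print(plug_in_krr_derivative(x, y, grid))   # plug-in derivative estimate
print(2 * np.pi * np.cos(2 * np.pi * grid)) # true derivative, for reference
\end{verbatim}

The derivative estimator simply replaces the kernel row $k(x)$ by its derivative while keeping the ridge coefficients fixed, which is why no retuning is needed across derivative orders.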