Previous analyses of regularized functional linear regression in a reproducing kernel Hilbert space (RKHS) typically require the target function to be contained in this kernel space. This paper studies the convergence performance of divide-and-conquer estimators in the scenario where the target function does not necessarily reside in the underlying RKHS. As a decomposition-based scalable approach, divide-and-conquer estimators of functional linear regression can substantially reduce the algorithmic complexity in time and memory. We develop an integral-operator approach to establish sharp finite-sample upper bounds for prediction with divide-and-conquer estimators under various regularity conditions on the explanatory variables and the target function. We also prove the asymptotic optimality of the derived rates by establishing matching minimax lower bounds. Finally, we consider the convergence of noiseless estimators and show that their rates can be arbitrarily fast under mild conditions.
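To make the decomposition concrete, the following is a minimal sketch (not the paper's implementation) of a divide-and-conquer estimator for functional linear regression with kernel ridge regularization: the sample is split into disjoint blocks, a regularized estimator of the slope function is computed on each block, and the local estimators are averaged. All names here (`fit_local`, `divide_and_conquer`, the Gaussian kernel, the grid discretization, and the parameters `lam` and `n_blocks`) are illustrative assumptions, not notation from the paper.

```python
# Illustrative sketch only: divide-and-conquer kernel ridge estimation for
# the functional linear model y = \int X(t) beta(t) dt + noise.
# All function and parameter names are hypothetical, not from the paper.
import numpy as np

def gaussian_kernel(S, T, gamma=10.0):
    """Gaussian kernel K(s, t) evaluated on grids S and T (an assumed choice)."""
    return np.exp(-gamma * (S[:, None] - T[None, :]) ** 2)

def fit_local(X, y, K_grid, lam, dt):
    """Regularized estimator of the slope function on one subsample.

    X: (n, p) covariate curves discretized on a grid of p points;
    y: (n,) responses; K_grid: (p, p) kernel matrix on the grid;
    lam: regularization parameter; dt: grid spacing for quadrature.
    Returns the estimated slope function evaluated on the grid.
    """
    n = X.shape[0]
    # Gram matrix G[i, j] ~ \int\int X_i(s) K(s, t) X_j(t) ds dt by quadrature.
    G = (X @ K_grid @ X.T) * dt * dt
    # Solve the (n x n) regularized system for the representer coefficients.
    alpha = np.linalg.solve(G / n + lam * np.eye(n), y / n)
    # Slope on the grid: beta(s) = sum_i alpha_i \int K(s, t) X_i(t) dt.
    return (K_grid @ X.T @ alpha) * dt

def divide_and_conquer(X, y, K_grid, lam, dt, n_blocks):
    """Average the local estimators computed on disjoint blocks of the sample."""
    blocks = np.array_split(np.arange(X.shape[0]), n_blocks)
    betas = [fit_local(X[b], y[b], K_grid, lam, dt) for b in blocks]
    return np.mean(betas, axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    p, n = 50, 400
    grid = np.linspace(0.0, 1.0, p)
    dt = grid[1] - grid[0]
    K_grid = gaussian_kernel(grid, grid)
    # Smooth random covariate curves from a small sine basis.
    basis = np.array([np.sin((k + 1) * np.pi * grid) for k in range(5)])
    X = rng.standard_normal((n, 5)) @ basis
    beta_true = np.sin(2 * np.pi * grid)
    y = X @ beta_true * dt + 0.1 * rng.standard_normal(n)
    beta_hat = divide_and_conquer(X, y, K_grid, lam=1e-3, dt=dt, n_blocks=4)
    print("L2 grid error:", np.sqrt(np.sum((beta_hat - beta_true) ** 2) * dt))
```

Under these assumptions, each block solves an (n/m)-by-(n/m) linear system instead of a single n-by-n one, which is the source of the time and memory savings the abstract refers to.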