Function-on-function linear regression is important for understanding the relationship between a response and a predictor that are both functions. In this article, we propose a reproducing kernel Hilbert space approach to function-on-function linear regression via penalised least squares, regularised by a thin-plate spline smoothness penalty. We study the minimax optimal convergence rate of our estimator of the coefficient function. We derive a Bahadur representation, which allows us to propose statistical inference methods based on the bootstrap and on the convergence of Banach-space-valued random variables in the sup-norm. We illustrate our method and verify our theoretical results via simulated data experiments and a real data example.
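For concreteness, the function-on-function linear model and the penalised least squares criterion described above can be sketched in a standard form; the notation below is illustrative and not taken verbatim from the paper:

```latex
% Function-on-function linear model: both response Y and predictor X are functions.
Y(t) = \alpha(t) + \int_{\mathcal{S}} \beta(s, t)\, X(s)\, ds + \varepsilon(t),
\qquad t \in \mathcal{T},

% Penalised least squares estimator of the bivariate coefficient function \beta,
% with a thin-plate spline smoothness penalty J(\beta) and tuning parameter \lambda > 0:
\hat{\beta} = \operatorname*{arg\,min}_{\beta}
\; \frac{1}{n} \sum_{i=1}^{n}
\left\| Y_i - \alpha - \int_{\mathcal{S}} \beta(s, \cdot)\, X_i(s)\, ds \right\|_{L^2}^{2}
+ \lambda\, J(\beta).
```

Here $J(\beta)$ denotes a thin-plate-type roughness functional penalising the second-order partial derivatives of $\beta(s, t)$; the minimisation is carried out over a reproducing kernel Hilbert space of bivariate functions.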