We introduce a novel method of multidimensional ridge penalization for functional local linear regression. Nonparametric regression of functional data extends its multivariate counterpart and is known to be sensitive to the choice of $J$, the dimension of the subspace onto which the data are projected. In the multivariate setting, a roughness penalty helps reduce variance; however, among the few works that treat roughness penalties in the functional setting, most tune with only a single scalar. Our approach proposes a class of data-adaptive ridge penalties, meaning that the model automatically adjusts the structure of the penalty to the data set. This structure has $J$ free parameters, enables a quadratic-programming search for the tuning parameters that minimize the estimated mean squared error (MSE) of prediction, and can apply a different roughness-penalty level to each of the $J$ basis functions. The method's gains in prediction accuracy and variance reduction with finite data are demonstrated through multiple simulation scenarios and two real-data examples. Its asymptotic performance is established and compared with that of unpenalized functional local linear regression.
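To make the idea of a multidimensional ridge penalty concrete, the following is a minimal sketch, not the paper's method: it fits a ridge regression on $J$ basis-coefficient scores with a separate penalty level per basis direction, and tunes the $J$ penalty levels by a simple coordinate-wise grid search over leave-one-out MSE (a stand-in for the quadratic-programming search described in the abstract). All data, dimensions, and the search scheme here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (assumed): n observations projected onto J basis directions.
n, J = 200, 4
X = rng.normal(size=(n, J))               # basis-coefficient scores
beta = np.array([2.0, -1.0, 0.5, 0.0])    # true coefficients (illustrative)
y = X @ beta + rng.normal(scale=0.5, size=n)

def ridge_fit(X, y, lam):
    """Ridge estimate with a separate penalty lam[j] on each basis direction."""
    P = np.diag(lam)                      # multidimensional (diagonal) ridge penalty
    return np.linalg.solve(X.T @ X + P, X.T @ y)

def loo_mse(X, y, lam):
    """Leave-one-out MSE via the ridge hat-matrix shortcut."""
    H = X @ np.linalg.solve(X.T @ X + np.diag(lam), X.T)
    resid = y - H @ y
    return np.mean((resid / (1.0 - np.diag(H))) ** 2)

# Coordinate-wise grid search over the J penalty levels; each sweep can only
# decrease the LOO MSE because the current value stays in the candidate grid.
grid = np.array([0.0, 0.1, 1.0, 10.0, 100.0])
lam = np.ones(J)
for _ in range(5):
    for j in range(J):
        trials = []
        for g in grid:
            cand = lam.copy()
            cand[j] = g
            trials.append(loo_mse(X, y, cand))
        lam[j] = grid[int(np.argmin(trials))]

beta_hat = ridge_fit(X, y, lam)
```

Because the penalty matrix is diagonal with one free entry per basis direction, directions with weak signal can be shrunk aggressively while strong directions are left nearly unpenalized, which is the variance-reduction mechanism the abstract refers to.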