Within the framework of reproducing kernel Hilbert spaces (RKHS), we consider penalized least-squares estimation for the partially functional linear model (PFLM), whose predictor combines a functional part with a traditional multivariate part, where the multivariate part may involve a diverging number of parameters. From a non-asymptotic point of view, we focus on rate-optimal upper and lower bounds for the prediction error. We establish an exact non-asymptotic upper bound for the excess prediction risk under a general assumption on the effective dimension of the model, from which we also derive prediction consistency when the number of multivariate covariates $p$ grows slowly with the sample size $n$. Our new finding reveals a trade-off between the number of non-functional predictors and the effective dimension of the kernel principal components that ensures prediction consistency in the increasing-dimensional setting. The analysis in our proof hinges on a spectral condition for the sandwich operator formed by the covariance operator and the reproducing kernel, and on sub-Gaussian and Bernstein concentration inequalities for random elements in Hilbert space. Finally, we derive a non-asymptotic minimax lower bound under a regularity assumption on the Kullback-Leibler divergence between the models.
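For concreteness, the PFLM setup the abstract refers to can be sketched in its standard form (a minimal sketch; the symbols $\beta_0$, $\gamma_0$, $\lambda$, and the domain $\mathcal{T}$ are illustrative conventions, not notation taken from the text):
\[
Y_i = Z_i^\top \beta_0 + \int_{\mathcal{T}} X_i(t)\,\gamma_0(t)\,dt + \varepsilon_i, \qquad i = 1, \dots, n,
\]
where $Z_i \in \mathbb{R}^p$ is the multivariate covariate, $X_i$ is a square-integrable random function on $\mathcal{T}$, and the slope function $\gamma_0$ lies in an RKHS $\mathcal{H}(K)$ with reproducing kernel $K$. The penalized least-squares estimator is then
\[
(\hat\beta, \hat\gamma) \in \operatorname*{arg\,min}_{\beta \in \mathbb{R}^p,\; \gamma \in \mathcal{H}(K)} \; \frac{1}{n} \sum_{i=1}^{n} \Big( Y_i - Z_i^\top \beta - \int_{\mathcal{T}} X_i(t)\,\gamma(t)\,dt \Big)^2 + \lambda\, \|\gamma\|_{\mathcal{H}(K)}^2,
\]
with tuning parameter $\lambda > 0$; the excess prediction risk of $(\hat\beta, \hat\gamma)$ is the quantity for which the upper and lower bounds are established.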