We consider a general functional linear regression model that allows for both functional and high-dimensional vector covariates. The proposed model further accommodates discretized observations of the functional variables and different reproducing kernel Hilbert spaces (RKHS) for the functional regression coefficients. In this general setting, we propose a penalized least squares approach in RKHS, where the penalties enforce both smoothness and sparsity on the functional estimators. We show that the excess prediction risk of our estimators is minimax optimal under this general model. Our analysis reveals an interesting phase transition phenomenon: the optimal excess risk is determined jointly by the sparsity and the smoothness of the functional regression coefficients. We also devise a novel optimization algorithm that simultaneously handles the smoothness and sparsity penalties.
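To make the estimation problem concrete, the following is a minimal, hypothetical sketch of penalized least squares for a discretized functional linear model with one functional covariate and a high-dimensional vector covariate. It uses a second-difference roughness penalty as a crude stand-in for an RKHS smoothness norm and a lasso penalty for sparsity, solved by proximal gradient descent; all variable names, penalty levels, and the solver are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

# Hypothetical sketch of the model  y_i = <X_i, beta> + Z_i' gamma + eps_i,
# where X_i is a functional covariate observed on a grid and Z_i is a
# high-dimensional vector covariate with a sparse coefficient gamma.
rng = np.random.default_rng(0)
n, m, p = 200, 50, 10            # samples, grid points, vector covariates
t = np.linspace(0, 1, m)
X = rng.normal(size=(n, m))      # discretized functional covariate
Z = rng.normal(size=(n, p))      # vector covariate
beta_true = np.sin(2 * np.pi * t)
gamma_true = np.r_[1.0, -1.0, np.zeros(p - 2)]   # sparse truth
y = X @ beta_true / m + Z @ gamma_true + 0.1 * rng.normal(size=n)

# Second-difference matrix: penalizes roughness of beta
# (an illustrative proxy for an RKHS smoothness penalty).
D = np.diff(np.eye(m), n=2, axis=0)
lam_smooth, lam_sparse = 1e-3, 0.05

def soft(v, s):
    """Proximal operator of the l1 (sparsity) penalty."""
    return np.sign(v) * np.maximum(np.abs(v) - s, 0.0)

beta, gamma = np.zeros(m), np.zeros(p)
step = 0.5
for _ in range(500):             # proximal gradient descent
    r = X @ beta / m + Z @ gamma - y
    g_beta = X.T @ r / (n * m) + lam_smooth * (D.T @ D @ beta)
    g_gamma = Z.T @ r / n
    beta = beta - step * g_beta                      # smooth penalty: plain gradient step
    gamma = soft(gamma - step * g_gamma, step * lam_sparse)  # sparsity: prox step
```

After the loop, `gamma` approximately recovers the sparsity pattern of `gamma_true`, while the roughness penalty keeps `beta` from overfitting the grid-level noise.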