In this article, we introduce a kernel-based consensual aggregation method for regression problems. We aim to flexibly combine individual regression estimators $r_1,r_2,\ldots,r_M$ using a weighted average in which the weights are defined by a kernel function. The method may be viewed as a kernel smoother applied to the feature vector of predictions produced by the individual estimators, rather than to the original inputs. This work extends the setting of Biau et al. (2016) to a more general kernel-based framework. We show that this configuration asymptotically inherits the consistency property of the consistent basic estimators. Moreover, for a suitable choice of kernel functions, we propose to learn the key parameter of the method numerically using a gradient descent algorithm instead of the classical grid search. Numerical experiments carried out on several simulated and real datasets suggest that introducing kernel functions improves the performance of the method.
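To illustrate the idea of a kernel smoother acting on the space of predictions rather than on the original inputs, the following is a minimal sketch with a Gaussian kernel. The function name, signature, and bandwidth parameter `h` are illustrative assumptions, not the authors' implementation; the key parameter learned by gradient descent in the paper corresponds here to the bandwidth, which is simply fixed by hand.

```python
import numpy as np

def kernel_aggregate(pred_x, preds_train, y_train, h=1.0):
    """Aggregate individual estimators' predictions via a Gaussian kernel.

    pred_x      : (M,) predictions of the M estimators at the query point
    preds_train : (n, M) predictions of the M estimators on the training points
    y_train     : (n,) training responses
    h           : kernel bandwidth (the method's key parameter; illustrative)
    """
    # Squared distances between prediction vectors, not original features
    d2 = np.sum((preds_train - pred_x) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * h ** 2))  # Gaussian kernel weights
    if w.sum() == 0.0:
        return float(y_train.mean())  # fallback when all weights vanish
    return float(np.dot(w, y_train) / w.sum())
```

A query point whose prediction vector lies midway between two training points' prediction vectors receives their average response, since the kernel weights are then equal.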