Given a {features, target} dataset, we introduce an incremental algorithm that constructs an aggregate regressor from an ensemble of neural networks. It is well known that ensemble methods suffer from multicollinearity, a manifestation of redundancy that arises mainly from the common training dataset. In the present incremental approach, at each stage we optimally blend the aggregate regressor with a newly trained neural network under a convexity constraint which, if necessary, induces negative correlations. Under this framework, collinearity issues do not arise at all, rendering the method both accurate and robust.
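As a rough illustration of the stage-wise convex blending described above (not the authors' exact procedure), the following minimal Python sketch trains one network per stage and mixes it into the current aggregate with a weight constrained to [0, 1]. The helper names (`train_network`, `optimal_convex_weight`, `incremental_ensemble`), the use of a squared-error criterion, and the blending on a held-out validation set are all assumptions for the sake of the example.

```python
# Minimal sketch of incremental convex blending of an ensemble of regressors.
# All helper names and the squared-error blending criterion are illustrative
# assumptions, not the paper's exact algorithm.
import numpy as np
from sklearn.neural_network import MLPRegressor


def train_network(X, y, seed):
    """Train one base neural network (hypothetical helper; any regressor works)."""
    net = MLPRegressor(hidden_layer_sizes=(32,), max_iter=500, random_state=seed)
    net.fit(X, y)
    return net


def optimal_convex_weight(y, f_agg, g_new):
    """Weight alpha in [0, 1] minimising the squared error of
    alpha * f_agg + (1 - alpha) * g_new against the targets y."""
    d = f_agg - g_new
    denom = np.dot(d, d)
    if denom == 0.0:                       # identical predictions: keep the aggregate
        return 1.0
    alpha = np.dot(d, y - g_new) / denom   # unconstrained least-squares optimum
    return float(np.clip(alpha, 0.0, 1.0))


def incremental_ensemble(X_train, y_train, X_val, y_val, n_stages=10):
    """Incrementally blend newly trained networks into an aggregate predictor."""
    members, weights = [], []
    f_agg_val = None
    for k in range(n_stages):
        net = train_network(X_train, y_train, seed=k)
        g_val = net.predict(X_val)
        if f_agg_val is None:              # first stage: aggregate is the first net
            members, weights, f_agg_val = [net], [1.0], g_val
            continue
        alpha = optimal_convex_weight(y_val, f_agg_val, g_val)
        # New aggregate = alpha * old aggregate + (1 - alpha) * new network.
        weights = [w * alpha for w in weights] + [1.0 - alpha]
        members.append(net)
        f_agg_val = alpha * f_agg_val + (1.0 - alpha) * g_val
    return members, weights
```

Because each stage solves only a one-dimensional convex blending problem, the member weights never come from inverting a nearly singular correlation matrix, which is where the usual multicollinearity problems of ensemble combination would otherwise appear.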