A new attention-based model for the gradient boosting machine (GBM), called AGBoost (attention-based gradient boosting), is proposed for solving regression problems. The main idea behind the proposed AGBoost model is to assign attention weights with trainable parameters to the iterations of GBM, under the condition that decision trees are the base learners in GBM. The attention weights are determined by applying properties of decision trees and by using Huber's contamination model, which yields an interesting linear dependence between the trainable attention parameters and the attention weights. This peculiarity allows us to train the attention weights by solving a standard quadratic optimization problem with linear constraints. The attention weights also depend on a discount factor as a tuning parameter, which determines how much the impact of a weight decreases with the number of iterations. Numerical experiments performed for two types of base learners, original decision trees and extremely randomized trees, with various regression datasets illustrate the proposed model.
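As a rough illustration of the mechanism described above, the following Python sketch aggregates the partial models produced by GBM iterations with attention weights of the ε-contamination form w_k = (1 − ε)·s_k + ε·g_k, where s_k is a fixed discounted score and g_k is trainable. This is a minimal sketch under stated assumptions, not the authors' implementation: the plain discounted score gamma**k standing in for the tree-based scores, depth-3 trees, the squared-error training objective, the general-purpose SLSQP solver standing in for a dedicated QP solver, and all names (fit_agboost, predict_agboost, eps, gamma) are illustrative assumptions. Only the contamination form of the weights, their linearity in the trainable parameters (which makes training a quadratic program with linear constraints), and the discount factor come from the text.

```python
import numpy as np
from scipy.optimize import minimize
from sklearn.tree import DecisionTreeRegressor


def fit_agboost(X, y, n_iters=50, lr=0.1, eps=0.2, gamma=0.9):
    """Boost trees, then learn attention weights over the iterations.

    Each weight mixes a fixed discounted score s_k with a trainable
    component g_k via Huber's epsilon-contamination:
        w_k = (1 - eps) * s_k + eps * g_k,   g_k >= 0, sum_k g_k = 1,
    so the aggregated prediction is linear in g and g can be found by a
    quadratic program with linear constraints.
    """
    y = y.astype(float)
    trees, cum_preds = [], []
    F = np.zeros(len(y))                       # current boosted prediction
    for k in range(n_iters):
        tree = DecisionTreeRegressor(max_depth=3).fit(X, y - F)
        F = F + lr * tree.predict(X)           # partial model after k+1 steps
        trees.append(tree)
        cum_preds.append(F.copy())
    P = np.column_stack(cum_preds)             # shape (n_samples, n_iters)

    # Fixed part of the weights: scores discounted by gamma per iteration,
    # so later iterations contribute less, controlled by the tuning parameter.
    s = gamma ** np.arange(n_iters)
    s /= s.sum()
    base = (1.0 - eps) * (P @ s)               # non-trainable contribution

    def objective(g):                          # squared error, quadratic in g
        return np.sum((y - base - eps * (P @ g)) ** 2)

    res = minimize(objective, np.full(n_iters, 1.0 / n_iters),
                   method="SLSQP",
                   bounds=[(0.0, None)] * n_iters,
                   constraints=({"type": "eq",
                                 "fun": lambda g: g.sum() - 1.0},))
    w = (1.0 - eps) * s + eps * res.x          # final attention weights
    return trees, w


def predict_agboost(trees, w, X, lr=0.1):
    """Attention-weighted convex combination of the partial boosted models."""
    F = np.zeros(X.shape[0])
    out = np.zeros(X.shape[0])
    for w_k, tree in zip(w, trees):
        F = F + lr * tree.predict(X)
        out += w_k * F
    return out
```

Because the weights enter the prediction linearly in g, the squared-error objective above is quadratic in g and the simplex constraints are linear, which is exactly why the abstract's training problem reduces to a standard constrained quadratic optimization.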