Gradient boosted trees are competition-winning, general-purpose, non-parametric regressors that exploit sequential model fitting and gradient descent to minimize a specific loss function. The most popular implementations are tailored to univariate regression and classification tasks, which precludes capturing multivariate target cross-correlations and applying structured penalties to the predictions. In this paper, we present a computationally efficient algorithm for fitting multivariate boosted trees. We show that multivariate trees can outperform their univariate counterparts when the predictions are correlated. Furthermore, the algorithm allows the predictions to be arbitrarily regularized, so that properties such as smoothness, consistency, and functional relations can be enforced. We present applications and numerical results related to forecasting and control.
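To make the idea concrete, the following is a minimal sketch of multivariate gradient boosting under squared loss. It is an illustration under simplifying assumptions, not the paper's algorithm: it uses scikit-learn's multi-output `DecisionTreeRegressor` as the base learner and plain shrinkage, with no structured penalty on the predictions. The function names and toy data are hypothetical.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def fit_multivariate_boost(X, Y, n_rounds=50, lr=0.1, max_depth=3):
    """Sequentially fit multi-output trees to the negative gradient of
    the squared loss 0.5 * ||Y - F||^2 (i.e. the residuals)."""
    init = Y.mean(axis=0)                       # initial prediction: column means
    F = np.tile(init, (len(Y), 1))
    trees = []
    for _ in range(n_rounds):
        residual = Y - F                        # negative gradient of the loss
        tree = DecisionTreeRegressor(max_depth=max_depth).fit(X, residual)
        F += lr * tree.predict(X)               # gradient-descent step in function space
        trees.append(tree)
    return init, trees

def predict(init, trees, X, lr=0.1):
    F = np.tile(init, (len(X), 1))
    for tree in trees:
        F += lr * tree.predict(X)
    return F

# Toy example with two correlated targets sharing a common driver.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
Y = np.column_stack([X[:, 0] + 0.1 * rng.normal(size=200),
                     X[:, 0] + X[:, 1]])
init, trees = fit_multivariate_boost(X, Y)
mse = np.mean((predict(init, trees, X) - Y) ** 2)
```

Because each tree predicts the full residual vector, every split is chosen jointly across all targets, which is how cross-correlations between the outputs can be exploited.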