The XGBoost method has many advantages and is especially suitable for the statistical analysis of big data, but its loss function is restricted to convex functions. In many specific applications, a nonconvex loss function would be preferable. In this paper, I propose a generalized XGBoost method, which imposes a weaker constraint on the loss function and therefore admits more general loss functions, including convex loss functions and some nonconvex loss functions. Furthermore, this generalized XGBoost method is extended to multivariate loss functions to form a more generalized XGBoost method. This method is a multiobjective parameter regularized tree boosting method that can model multiple parameters of most frequently used parametric probability distributions as functions of the predictor variables. The related algorithms and some examples in non-life insurance pricing are also given.
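To make the convexity constraint concrete, the sketch below illustrates the standard second-order machinery that the abstract refers to: XGBoost fits each tree by a second-order Taylor expansion of the loss, and the optimal leaf weight is w* = -G/(H + lambda), where G and H sum the per-instance gradients and hessians. Standard XGBoost assumes each hessian h_i > 0 (a convex loss); the Cauchy loss used here is one example of a nonconvex loss whose hessian turns negative for large residuals. This is an illustrative assumption for exposition, not the paper's own construction; the loss, the regularization parameter `lam`, and the function names are chosen for this sketch.

```python
import math

def cauchy_grad_hess(pred, label, c=1.0):
    """Gradient and hessian of the (nonconvex) Cauchy loss
    L = log(1 + ((pred - label)/c)**2).
    The hessian is negative whenever |pred - label| > c,
    which violates the convexity assumption of standard XGBoost."""
    r = (pred - label) / c
    denom = 1.0 + r * r
    grad = 2.0 * r / (c * denom)
    hess = 2.0 * (1.0 - r * r) / (c * c * denom * denom)
    return grad, hess

def leaf_weight(preds, labels, lam=1.0):
    """Optimal leaf weight w* = -G / (H + lam) under the
    second-order approximation of the boosting objective."""
    G = H = 0.0
    for p, y in zip(preds, labels):
        g, h = cauchy_grad_hess(p, y)
        G += g
        H += h
    return -G / (H + lam)
```

For a leaf dominated by large residuals, H can be negative, so the usual leaf-weight formula may lose its interpretation as the minimizer of a convex quadratic; relaxing this requirement is precisely the kind of situation the generalized method addresses.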