Most robust machine learning methods can be viewed as reweighted estimators. To overcome the optimization difficulties of implicitly reweighted robust methods (those that modify the loss function or objective), we adopt a more direct approach: an explicitly iteratively reweighted method that handles noise robustly, even heavy-tailed noise and outliers. In this paper, an explicit iteratively reweighted framework is established on top of two kernel-based regression algorithms (LS-SVR and ELM), and a novel weight selection strategy is proposed. Combining the proposed weight function with the iteratively reweighted framework, we obtain two models, the iteratively reweighted least squares support vector machine (IRLS-SVR) and the iteratively reweighted extreme learning machine (IRLS-ELM), to perform robust regression. Unlike traditional explicitly reweighted robust methods, we carry out multiple reweighting operations to further improve robustness. The convergence and approximation properties of the proposed algorithms are proved theoretically, and their robustness is analyzed in detail from several angles. Experiments on both synthetic data and benchmark datasets confirm the effectiveness of the proposed methods.
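The reweighting loop described above (fit, compute residuals, downweight large residuals, refit) can be sketched as follows. This is a minimal illustration on plain ridge regression, not the paper's LS-SVR/ELM formulation, and the Huber-style weight function is a common textbook choice standing in for the proposed weight selection strategy:

```python
import numpy as np

def irls_ridge(X, y, lam=1.0, c=1.345, n_iter=20):
    """Iteratively reweighted ridge regression: a sketch of the explicit
    reweighting loop. The Huber-type weights below are an illustrative
    assumption, not the paper's proposed weight function."""
    n, d = X.shape
    w = np.ones(n)            # start from uniform sample weights
    beta = np.zeros(d)
    for _ in range(n_iter):
        # Weighted ridge solve: (X^T W X + lam I) beta = X^T W y
        W = np.diag(w)
        beta = np.linalg.solve(X.T @ W @ X + lam * np.eye(d), X.T @ (w * y))
        r = y - X @ beta                              # residuals on current fit
        s = np.median(np.abs(r)) / 0.6745 + 1e-12     # robust scale estimate (MAD)
        u = np.abs(r) / s
        w = np.where(u <= c, 1.0, c / u)              # downweight large residuals
    return beta
```

Each pass solves a weighted least squares problem, so samples with large residuals (likely outliers) contribute progressively less; repeating the reweighting several times, as the abstract advocates, lets the weights and the fit stabilize jointly.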