Boosting is a fundamental approach in machine learning that enjoys both strong theoretical and practical guarantees. At a high level, boosting algorithms cleverly aggregate weak learners to generate predictions with arbitrarily high accuracy. In this way, boosting algorithms convert weak learners into strong ones. Recently, Brukhim et al. extended boosting to the online agnostic binary classification setting. A key ingredient in their approach is a clean and simple reduction to online convex optimization, one that efficiently converts an arbitrary online convex optimizer into an agnostic online booster. In this work, we extend this reduction to multiclass problems and give the first boosting algorithm for online agnostic multiclass classification. Our reduction also enables the construction of algorithms for statistical agnostic, online realizable, and statistical realizable multiclass boosting.