The iterative weight update of the AdaBoost machine learning algorithm can be realized as a dynamical map on a probability simplex. When trained on a low-dimensional data set, the algorithm has a tendency toward cycling behavior, which is the topic of this paper. AdaBoost's cycling behavior lends itself to direct computational methods that are ineffective in the general, non-cycling case of the algorithm. From these computational properties we derive a concrete correspondence between AdaBoost's cycling behavior and the dynamics of continued fractions. We then use this correspondence to explain how the algorithm arrives at this periodic state in the first place. We intend this work to be a novel, self-contained explanation of the cycling dynamics of this machine learning algorithm.
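As a minimal sketch of the dynamical map in question, the standard AdaBoost reweighting step can be written as a function that sends one point of the probability simplex to another. The toy data set below (three points, three weak learners, each misclassifying exactly one point) is a hypothetical illustration chosen for this sketch, not data from the paper; the reweighting formula itself is the standard AdaBoost update.

```python
import numpy as np

def adaboost_step(w, y, h_pred):
    """One AdaBoost reweighting step, viewed as a map on the probability simplex.

    w      : current sample weights (nonnegative, summing to 1)
    y      : true labels in {-1, +1}
    h_pred : predictions of the chosen weak learner, in {-1, +1}
    """
    eps = w[h_pred != y].sum()               # weighted error of the chosen weak learner
    alpha = 0.5 * np.log((1.0 - eps) / eps)  # the learner's vote weight
    w = w * np.exp(-alpha * y * h_pred)      # up-weight mistakes, down-weight hits
    return w / w.sum()                       # renormalize back onto the simplex

# Hypothetical 3-point data set: three weak learners, each misclassifying
# exactly one point (an assumption for illustration only).
y = np.array([1.0, 1.0, 1.0])
stumps = [np.array([-1.0, 1.0, 1.0]),
          np.array([1.0, -1.0, 1.0]),
          np.array([1.0, 1.0, -1.0])]

w = np.full(3, 1.0 / 3.0)
for t in range(9):
    errors = [w[h != y].sum() for h in stumps]
    h = stumps[int(np.argmin(errors))]       # "optimal" AdaBoost: pick the lowest weighted error
    w = adaboost_step(w, y, h)
    print(t, np.round(w, 4))
```

Two standard invariants of this map are visible in the sketch: the weights remain on the simplex after every step, and the just-used weak learner's weighted error is driven to exactly 1/2, which is what pushes the selection toward a different learner on the next iteration.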