Recent years have witnessed enormous progress in online learning. However, a major challenge on the road to artificial agents is concept drift: when data instances arrive sequentially in a streaming fashion, the underlying data distribution may change over time, which can cause catastrophic forgetting and degrade model performance. In this paper, we propose a new Bilevel Online Deep Learning (BODL) framework, which combines a bilevel optimization strategy with an online ensemble classifier. In the BODL algorithm, we use an ensemble classifier that builds multiple base classifiers from the outputs of different hidden layers of a deep neural network; the importance weights of the base classifiers are updated in an online manner via an exponential gradient descent method. In addition, we apply a similarity constraint to overcome the convergence problem of the online ensemble framework. We then design an effective concept drift detection mechanism that uses the classifier's error rate to monitor changes in the data distribution. When concept drift is detected, the BODL algorithm adaptively updates the model parameters via bilevel optimization, thereby circumventing large drift and encouraging positive transfer. Finally, extensive experiments and ablation studies on various datasets show competitive numerical results, illustrating that our BODL algorithm is a promising approach.
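To make the two online components described above concrete, the following is a minimal sketch (not the authors' implementation) of (i) an ensemble whose base classifiers read the outputs of different hidden layers and whose importance weights follow an exponentiated-gradient update, and (ii) a simple error-rate-based drift monitor in the spirit of DDM. All names and hyperparameters (HedgedDeepEnsemble, ErrorRateDriftDetector, eta, sigma_mult, ...) are illustrative assumptions, not the paper's API; the bilevel parameter update triggered on drift is omitted.

```python
# Hypothetical sketch of an online hidden-layer ensemble with exponential-gradient
# weight updates plus an error-rate drift monitor; names are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class HedgedDeepEnsemble(nn.Module):
    """Deep net whose every hidden layer feeds a base classifier; the final
    prediction is a convex combination of the base outputs."""

    def __init__(self, in_dim, hidden_dim, n_layers, n_classes, eta=0.1):
        super().__init__()
        dims = [in_dim] + [hidden_dim] * n_layers
        self.blocks = nn.ModuleList(
            nn.Sequential(nn.Linear(dims[i], dims[i + 1]), nn.ReLU())
            for i in range(n_layers)
        )
        self.heads = nn.ModuleList(nn.Linear(hidden_dim, n_classes) for _ in range(n_layers))
        # Importance weights of the base classifiers, kept on the probability simplex.
        self.register_buffer("alpha", torch.full((n_layers,), 1.0 / n_layers))
        self.eta = eta  # step size of the exponentiated-gradient update

    def forward(self, x):
        logits, h = [], x
        for block, head in zip(self.blocks, self.heads):
            h = block(h)
            logits.append(head(h))
        return logits  # one logit tensor per base classifier

    def predict(self, x):
        probs = torch.stack([F.softmax(l, dim=-1) for l in self.forward(x)])  # (L, B, C)
        return torch.einsum("l,lbc->bc", self.alpha, probs)

    @torch.no_grad()
    def update_alpha(self, losses):
        """Exponential-gradient step: alpha_i <- alpha_i * exp(-eta * loss_i), renormalised."""
        self.alpha *= torch.exp(-self.eta * losses)
        self.alpha /= self.alpha.sum()


class ErrorRateDriftDetector:
    """Flags drift when the running error rate rises well above its best value
    (a simplified DDM-style rule; the 3-sigma threshold is an assumption)."""

    def __init__(self, warmup=30, sigma_mult=3.0):
        self.n, self.errors = 0, 0
        self.best_rate, self.best_std = float("inf"), float("inf")
        self.warmup, self.sigma_mult = warmup, sigma_mult

    def update(self, is_error):
        self.n += 1
        self.errors += int(is_error)
        rate = self.errors / self.n
        std = (rate * (1 - rate) / self.n) ** 0.5
        if rate + std < self.best_rate + self.best_std:
            self.best_rate, self.best_std = rate, std
        return self.n > self.warmup and rate + std > self.best_rate + self.sigma_mult * self.best_std
```

In an online loop one would, for each incoming instance, compute the per-head losses, call `update_alpha` with them, backpropagate through the network as usual, and feed the prediction error into the detector; when the detector fires, the paper's bilevel optimization step would refresh the model parameters and the detector statistics would be reset.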