As progress is made on training machine learning models on incrementally expanding classification tasks (i.e., incremental learning), a natural next step is to translate this progress to meet industry expectations. One technique missing from incremental learning is automatic architecture design via Neural Architecture Search (NAS). In this paper, we show that leveraging NAS for incremental learning yields strong performance gains on classification tasks. Specifically, we contribute the following: first, we create a strong baseline approach for incremental learning based on Differentiable Architecture Search (DARTS) and state-of-the-art incremental learning strategies, outperforming many existing strategies trained with similarly sized popular architectures; second, we extend the idea of architecture search to regularize architecture forgetting, boosting performance beyond our proposed baseline. We evaluate our method on both RF signal and image classification tasks, and demonstrate that we can achieve up to a 10% performance increase over state-of-the-art methods. Most importantly, our contribution enables learning from continuous distributions on real-world application data for which the complexity of the data distribution is unknown, or the modality is less explored (such as RF signal classification).
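To make the two ideas above concrete, the sketch below illustrates (a) a DARTS-style mixed operation, in which architecture parameters (alpha) softly weight candidate operations, and (b) one plausible way to "regularize architecture forgetting" by anchoring the current task's alphas to those learned on the previous task. This is a minimal illustrative sketch, not the paper's exact method: the candidate operation set, the `MixedOp` class, and the quadratic penalty form are all our assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedOp(nn.Module):
    """DARTS-style mixed operation: a softmax-weighted sum of candidate ops.

    The candidate set here (3x3 conv, 5x5 conv, identity) is a toy choice;
    DARTS search spaces are typically richer.
    """
    def __init__(self, channels: int):
        super().__init__()
        self.ops = nn.ModuleList([
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.Conv2d(channels, channels, kernel_size=5, padding=2),
            nn.Identity(),
        ])
        # Architecture parameters (alpha): one logit per candidate op.
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        weights = F.softmax(self.alpha, dim=0)
        return sum(w * op(x) for w, op in zip(weights, self.ops))

def architecture_forgetting_penalty(op: MixedOp,
                                    prev_alpha: torch.Tensor,
                                    lam: float = 1.0) -> torch.Tensor:
    """Illustrative regularizer: penalize drift of the current architecture
    parameters away from those learned on the previous task. The L2 form
    and the single hyperparameter `lam` are assumptions for this sketch.
    """
    return lam * (op.alpha - prev_alpha).pow(2).sum()

# Usage: after finishing task t-1, snapshot alpha; while training task t,
# add the penalty to the task loss so the searched architecture changes
# smoothly rather than being overwritten.
op = MixedOp(channels=8)
prev_alpha = op.alpha.detach().clone()   # snapshot from the previous task
x = torch.randn(2, 8, 16, 16)
task_loss = op(x).pow(2).mean()          # stand-in for a real task loss
loss = task_loss + architecture_forgetting_penalty(op, prev_alpha, lam=0.1)
loss.backward()
```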