Continuously learning new classes without catastrophic forgetting is a challenging problem for on-device environmental sound classification, given the constraints on computational resources (e.g., model size, running memory). To address this issue, we propose a simple and efficient continual learning method. Our method selects historical data for training by measuring per-sample classification uncertainty. Specifically, we measure the uncertainty by observing how the classification probability of a sample fluctuates under parallel perturbations added to its classifier embedding. In this way, the computational cost is significantly reduced compared with adding perturbations to the raw data. Experimental results on the DCASE 2019 Task 1 and ESC-50 datasets show that our proposed method outperforms baseline continual learning methods in both classification accuracy and computational efficiency, indicating that our method can efficiently and incrementally learn new classes without catastrophic forgetting for on-device environmental sound classification.
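To make the core idea concrete, below is a minimal sketch (not the authors' code) of uncertainty-based exemplar selection via parallel embedding perturbations. It assumes a fixed classifier head operating on precomputed embeddings; the noise scale `sigma`, the number of perturbations `num_perturb`, and the fluctuation measure (standard deviation of the predicted-class probability across perturbations) are illustrative assumptions, not necessarily the paper's exact formulation.

```python
# Hypothetical sketch: score historical samples by how much their class
# probabilities fluctuate when Gaussian noise is added to their embeddings,
# then keep the most uncertain ones as replay exemplars.
import torch
import torch.nn.functional as F

def embedding_uncertainty(classifier_head, embeddings, num_perturb=8, sigma=0.1):
    """Return a per-sample uncertainty score (higher = less certain).

    embeddings: (N, D) tensor of precomputed classifier embeddings.
    """
    N, D = embeddings.shape
    # Replicate each embedding num_perturb times and perturb all copies in
    # one batch: the expensive feature extractor runs zero extra times,
    # only the lightweight classifier head sees the perturbed copies.
    noise = sigma * torch.randn(num_perturb, N, D)
    perturbed = embeddings.unsqueeze(0) + noise                 # (K, N, D)
    with torch.no_grad():
        logits = classifier_head(perturbed.reshape(-1, D))      # (K*N, C)
        probs = F.softmax(logits, dim=-1).reshape(num_perturb, N, -1)
        clean_pred = classifier_head(embeddings).argmax(dim=-1)  # (N,)
    # Fluctuation of the probability assigned to the unperturbed prediction.
    pred_probs = probs[:, torch.arange(N), clean_pred]          # (K, N)
    return pred_probs.std(dim=0)

def select_exemplars(scores, budget):
    """Keep the `budget` most uncertain historical samples for replay."""
    return torch.topk(scores, k=budget).indices

# Usage with stand-in shapes: a linear head over 128-d embeddings, 10 classes.
head = torch.nn.Linear(128, 10)
emb = torch.randn(500, 128)   # embeddings of 500 historical samples
keep = select_exemplars(embedding_uncertainty(head, emb), budget=50)
```

Because the perturbations are applied at the embedding level and batched together, the per-sample cost is `num_perturb` extra passes through the classifier head only, rather than through the full network as raw-input perturbation would require.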