Continual learning involves training neural networks incrementally for new tasks while retaining the knowledge of previous tasks. However, efficiently fine-tuning the model for sequential tasks with minimal computational resources remains a challenge. In this paper, we propose Task Incremental Continual Learning (TI-CL) of audio classifiers with both parameter-efficient and compute-efficient Audio Spectrogram Transformers (AST). To reduce the trainable parameters without performance degradation in TI-CL, we compare several Parameter-Efficient Transfer (PET) methods and propose AST with Convolutional Adapters, which uses less than 5% of the trainable parameters of its fully fine-tuned counterpart. To reduce the computational complexity, we introduce a novel Frequency-Time factorized Attention (FTA) method that replaces the traditional self-attention in transformers for audio spectrograms. FTA achieves competitive performance with only a fraction of the computations required by Global Self-Attention (GSA). Finally, we formulate our method for TI-CL, called Adapter Incremental Continual Learning (AI-CL), as a combination of the "parameter-efficient" Convolutional Adapter and the "compute-efficient" FTA. Experiments on the ESC-50, SpeechCommandsV2 (SCv2), and Audio-Visual Event (AVE) benchmarks show that our proposed method prevents catastrophic forgetting in TI-CL while maintaining a lower computational budget.
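To make the factorization concrete, the following is a minimal PyTorch sketch of frequency-time factorized attention over a grid of spectrogram patch embeddings. It is an illustration under our own assumptions, not the paper's reference implementation: the module names (AxisAttention, FreqTimeAttention), the (batch, F, T, dim) tensor layout, and the head count are placeholders chosen for exposition, and only nn.MultiheadAttention is standard PyTorch. The point it demonstrates is the complexity argument: attending along each axis separately reduces the attention cost from O((F*T)^2) for global self-attention to O(F*T*(F+T)).

    # Hedged sketch: factorized attention over an audio-spectrogram patch grid.
    # Names and tensor layout are illustrative assumptions, not the paper's code.
    import torch
    import torch.nn as nn

    class AxisAttention(nn.Module):
        """Multi-head self-attention applied independently within each group."""
        def __init__(self, dim: int, num_heads: int = 4):
            super().__init__()
            self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (groups, seq_len, dim); tokens attend only within their group
            out, _ = self.attn(x, x, x)
            return out

    class FreqTimeAttention(nn.Module):
        """Factorized stand-in for global self-attention on an F x T patch grid.

        Global attention over F*T tokens costs O((F*T)^2); attending along the
        frequency axis and then the time axis costs O(F*T*(F+T)), a fraction
        of the former for typical spectrogram shapes.
        """
        def __init__(self, dim: int, num_heads: int = 4):
            super().__init__()
            self.freq_attn = AxisAttention(dim, num_heads)
            self.time_attn = AxisAttention(dim, num_heads)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            b, f, t, d = x.shape
            # Frequency axis: each time column is a separate sequence of F tokens.
            xf = x.permute(0, 2, 1, 3).reshape(b * t, f, d)
            xf = self.freq_attn(xf).reshape(b, t, f, d).permute(0, 2, 1, 3)
            # Time axis: each frequency row is a separate sequence of T tokens.
            xt = xf.reshape(b * f, t, d)
            return self.time_attn(xt).reshape(b, f, t, d)

    if __name__ == "__main__":
        # Hypothetical grid: 12 frequency patches x 100 time patches, dim 768.
        x = torch.randn(2, 12, 100, 768)
        y = FreqTimeAttention(768, num_heads=4)(x)
        print(y.shape)  # torch.Size([2, 12, 100, 768])

For the example shape above, global self-attention would score (12*100)^2 = 1.44M token pairs per layer, while the factorized form scores 12*100*(12+100) = 134.4K, which is where the compute savings over GSA come from.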