Future deep learning systems call for techniques that can deal with the evolving nature of temporal data and the scarcity of annotations when new problems occur. As a step towards this goal, we present FUSION (Few-shot UnSupervIsed cONtinual learning), a learning strategy that enables a neural network to learn quickly and continually on streams of unlabelled data and unbalanced tasks. The objective is to maximise the knowledge extracted from the unlabelled data stream (unsupervised), favour the forward transfer of previously learnt tasks and features (continual), and exploit as much as possible the supervised information when available (few-shot). The core of FUSION is MEML (Meta-Example Meta-Learning), which consolidates a meta-representation through the use of a self-attention mechanism during a single inner loop in the meta-optimisation stage. To further enhance the capability of MEML to generalise from few data, we extend it by creating various augmented surrogate tasks and by optimising over the hardest one. An extensive experimental evaluation on public computer vision benchmarks shows that FUSION outperforms existing state-of-the-art solutions in both the few-shot and continual learning experimental settings.