Advances in deep learning have enabled machine learning methods to outperform humans in many areas, yet it remains a great challenge for a well-trained model to adapt quickly to a new task. One promising route to this goal is meta-learning, also known as learning to learn, which has achieved strong results in few-shot learning. However, current approaches still differ substantially from how humans learn, especially in their ability to extract structural and transferable knowledge. This drawback makes current meta-learning frameworks hard to interpret and hard to extend to more complex tasks. We tackle this problem by introducing concept discovery into few-shot learning: we achieve more effective adaptation by meta-learning the structure among data features, yielding a composite representation of the data. Our proposed method, Concept-Based Model-Agnostic Meta-Learning (COMAML), achieves consistent improvements on structured data across both synthetic and real-world datasets.
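To make the idea concrete, below is a minimal sketch, not the authors' implementation, of a MAML-style inner/outer loop in which input features are softly assigned to a small set of learned concepts before task adaptation. The `ConceptEncoder` module, the number of concepts, and the single inner gradient step are illustrative assumptions; the abstract does not specify the actual COMAML architecture.

```python
# Hedged sketch of concept-based meta-learning; names and architecture are
# illustrative assumptions, not the paper's implementation.
import torch
import torch.nn as nn
from torch.func import functional_call

class ConceptEncoder(nn.Module):
    """Softly assigns input features to concepts, then predicts from the composite code."""
    def __init__(self, n_features, n_concepts, n_classes):
        super().__init__()
        # Meta-learned assignment of features to concepts: the "structure" among features.
        self.assign = nn.Parameter(torch.randn(n_concepts, n_features))
        self.head = nn.Linear(n_concepts, n_classes)

    def forward(self, x):
        # Composite representation: one pooled value per concept.
        weights = torch.softmax(self.assign, dim=-1)   # (C, F), rows sum to 1 over features
        concept_codes = x @ weights.t()                # (B, C)
        return self.head(concept_codes)

def maml_step(model, support, query, inner_lr=0.1):
    """One meta-training step: adapt on the support set, evaluate on the query set."""
    xs, ys = support
    xq, yq = query
    params = dict(model.named_parameters())
    # Inner loop: one gradient step on the support set (task-specific adaptation).
    inner_loss = nn.functional.cross_entropy(functional_call(model, params, (xs,)), ys)
    grads = torch.autograd.grad(inner_loss, list(params.values()), create_graph=True)
    adapted = {k: v - inner_lr * g for (k, v), g in zip(params.items(), grads)}
    # Outer objective: query-set loss of the adapted parameters, differentiable
    # with respect to the meta-parameters (including the concept assignments).
    return nn.functional.cross_entropy(functional_call(model, adapted, (xq,)), yq)

# Illustrative usage with random tensors standing in for one few-shot task.
model = ConceptEncoder(n_features=20, n_concepts=4, n_classes=5)
meta_opt = torch.optim.Adam(model.parameters(), lr=1e-3)
support = (torch.randn(25, 20), torch.randint(0, 5, (25,)))
query = (torch.randn(75, 20), torch.randint(0, 5, (75,)))
meta_loss = maml_step(model, support, query)
meta_opt.zero_grad()
meta_loss.backward()
meta_opt.step()
```

In this sketch the feature-to-concept assignment is shared across tasks and updated only in the outer loop, which is one plausible reading of "meta-learning the structure among the data features"; the per-task adaptation then operates on the resulting composite representation.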