Although deep neural networks are capable of achieving performance superior to humans on various tasks, they are notorious for requiring large amounts of data and computing resources, restricting their success to domains where such resources are available. Meta-learning methods can address this problem by transferring knowledge from related tasks, thus reducing the amount of data and computing resources needed to learn new tasks. We organize the MetaDL competition series, which provides opportunities for research groups all over the world to create and experimentally assess new meta-(deep)learning solutions for real problems. In this paper, authored collaboratively by the competition organizers and the top-ranked participants, we describe the design of the competition, the datasets, the best experimental results, and the top-ranked methods in the NeurIPS 2021 challenge, which attracted 15 active teams that made it to the final phase (by outperforming the baseline) and made over 100 code submissions during the feedback phase. The solutions of the top participants have been open-sourced. Among the lessons learned is that learning good representations is essential for effective transfer learning.