In order to quickly adapt to new data, few-shot learning aims to learn from only a few examples, often by drawing on previously acquired knowledge. The new data often differs from the previously seen data due to a domain shift, that is, a change of the input-target distribution. While several methods perform well on small domain shifts, such as new target classes with similar inputs, larger domain shifts remain challenging. Large domain shifts may result in high-level concepts that are not shared between the original and the new domain. However, low-level concepts, such as edges in images, might still be shared and useful. For cross-domain few-shot learning, we suggest representation fusion to unify different abstraction levels of a deep neural network into one representation. We propose Cross-domain Hebbian Ensemble Few-shot learning (CHEF), which achieves representation fusion by an ensemble of Hebbian learners acting on different layers of a deep neural network that was trained on the original domain. On the few-shot datasets miniImagenet and tieredImagenet, where the domain shift is small, CHEF is competitive with state-of-the-art methods. On cross-domain few-shot benchmark challenges with larger domain shifts, CHEF establishes novel state-of-the-art results in all categories. We further apply CHEF to a real-world cross-domain application in drug discovery. We consider a domain shift from bioactive molecules to environmental chemicals and drugs with twelve associated toxicity prediction tasks. On these tasks, which are highly relevant to computational drug discovery, CHEF significantly outperforms all its competitors. GitHub: https://github.com/ml-jku/chef
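The core idea of the abstract, fusing representations from several layers via an ensemble of per-layer learners, can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the layer activations are mocked with random features (in CHEF they come from a backbone trained on the original domain), and a simple error-corrected Hebbian-style (delta-rule) linear readout stands in for the paper's Hebbian learner. All names, dimensions, and the learning rule here are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_way, n_shot = 5, 5                  # a 5-way 5-shot episode (assumed)
n_support = n_way * n_shot
layer_dims = [64, 160, 320, 640]      # hypothetical feature sizes per layer

# Mock per-layer support features; in CHEF these are activations of a
# backbone network trained on the original domain.
features = [rng.normal(size=(n_support, d)) for d in layer_dims]
labels = np.eye(n_way)[np.repeat(np.arange(n_way), n_shot)]  # one-hot targets

def hebbian_fit(x, y, lr=1e-3, steps=50):
    """Fit a linear readout with a simple error-corrected Hebbian-style
    (delta-rule) update; an illustrative stand-in for the paper's learner."""
    w = np.zeros((x.shape[1], y.shape[1]))
    for _ in range(steps):
        w += lr * x.T @ (y - x @ w)   # strengthen weights toward the targets
    return w

# One Hebbian learner per layer: the ensemble over abstraction levels.
readouts = [hebbian_fit(f, labels) for f in features]

# Representation fusion at query time: sum the per-layer ensemble scores.
query = [rng.normal(size=(1, d)) for d in layer_dims]
scores = sum(q @ w for q, w in zip(query, readouts))
pred = scores.argmax(axis=1)          # fused class prediction for the query
```

The fusion step here is a plain sum of per-layer scores; the key design point it illustrates is that each abstraction level contributes its own few-shot learner, so low-level features can carry the prediction when high-level concepts do not transfer across the domain shift.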