Learning a new concept from a single example is a remarkable capability of the human brain, and it has drawn attention in machine learning as the one-shot learning task. In this paper, we propose one of the simplest methods for this task, based on nonparametric weight imprinting, named Direct ONE-shot learning (DONE). DONE adds new classes to a pretrained deep neural network (DNN) classifier with neither training optimization nor modification of the pretrained DNN. Inspired by Hebbian theory, DONE directly uses the neural activity input to the final dense layer, obtained from data belonging to the new class, as the synaptic weights of a newly provided output neuron for that class, transferring all statistical properties of the neural activity to those of the synaptic weights by quantile normalization. DONE requires only a single inference to learn a new concept, and its procedure is simple and deterministic, requiring neither parameter tuning nor hyperparameters. DONE overcomes a severe problem of existing weight imprinting methods, in which, depending on the DNN, the imprinted weights interfere with the classification of original-class images. The performance of DONE depends entirely on the pretrained DNN model used as a backbone, and we confirmed that DONE with current well-trained backbone models performs with decent accuracy.
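The imprinting step described above can be sketched in a few lines of NumPy. This is a minimal illustration under our own assumptions (the function name, and using the mean of the sorted existing weight rows as the reference distribution for quantile normalization, are illustrative choices, not necessarily the authors' exact implementation):

```python
import numpy as np

def done_imprint(W, activity):
    """Add one new class to a final dense layer by direct weight imprinting.

    W        : (num_classes, dim) weight matrix of the final dense layer.
    activity : (dim,) activation input to the final layer from ONE example
               of the new class.

    Quantile normalization replaces each activation by the value at the same
    rank in a reference distribution drawn from the existing weights, so the
    new row shares the statistical properties of the trained weights.
    """
    # Reference distribution: elementwise mean of the sorted existing rows
    # (an illustrative choice of reference).
    ref = np.sort(W, axis=1).mean(axis=0)      # (dim,) ascending values
    # Rank of each activation (0 = smallest), then map ranks onto ref values.
    ranks = np.argsort(np.argsort(activity))   # (dim,) integer ranks
    new_row = ref[ranks]                       # quantile-normalized weights
    # Append the imprinted row as the weights of the new output neuron.
    return np.vstack([W, new_row])             # (num_classes + 1, dim)
```

Classifying then proceeds as usual with the enlarged weight matrix; no gradient step or retraining is involved, which is why a single forward pass suffices.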