Concept-oriented deep learning (CODL) is a general approach to meeting future challenges in deep learning: (1) learning with little or no external supervision, (2) coping with test examples drawn from a different distribution than the training examples, and (3) integrating deep learning with symbolic AI. In CODL, as in human learning, concept representations are learned from concept exemplars. Contrastive self-supervised learning (CSSL) provides a promising approach to doing so, since it: (1) uses data-driven associations, to get away from semantic labels, (2) supports incremental and continual learning, to get away from (large) fixed datasets, and (3) accommodates emergent objectives, to get away from fixed objectives (tasks). We discuss major aspects of concept representation learning using CSSL. These include dual-level concept representations, CSSL for feature representations, exemplar similarity measures and self-supervised relational reasoning, incremental and continual CSSL, and contrastive self-supervised concept (class) incremental learning. The discussion leverages recent findings from cognitive neuroscience and CSSL.
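To make the central technique concrete: CSSL methods typically learn feature representations by pulling two views (exemplars) of the same instance together in embedding space while pushing apart views of different instances. The following is a minimal NumPy sketch of one common contrastive objective (an InfoNCE-style loss); the function name and the toy embeddings are illustrative assumptions, not part of the discussed framework.

```python
import numpy as np

def info_nce_loss(z1, z2, temperature=0.1):
    """Contrastive (InfoNCE-style) loss over two batches of embeddings.

    Row i of z1 and row i of z2 form a positive pair (two views of the
    same exemplar); all other rows serve as negatives. Lower loss means
    matching views are closer than non-matching ones.
    """
    # L2-normalize so dot products are cosine similarities
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / temperature              # pairwise similarities
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))            # positives on the diagonal

# Toy check: slightly perturbed views of the same exemplars (aligned)
# should score a lower loss than views paired with the wrong exemplars.
rng = np.random.default_rng(0)
z = rng.normal(size=(8, 16))
aligned = info_nce_loss(z, z + 0.01 * rng.normal(size=(8, 16)))
shuffled = info_nce_loss(z, rng.permutation(z, axis=0))
print(aligned < shuffled)
```

Note that no semantic labels appear anywhere: the "supervision" is entirely data-driven (instance identity across views), which is what lets CSSL support the label-free, incrementally growing setting the abstract describes.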