The notion of concept has been studied for centuries by philosophers, linguists, cognitive scientists, and researchers in artificial intelligence (Margolis & Laurence, 1999). There is a large literature on formal, mathematical models of concepts, including a whole sub-field of AI -- Formal Concept Analysis -- devoted to this topic (Ganter & Obiedkov, 2016). Recently, researchers in machine learning have begun to investigate how methods from representation learning can be used to induce concepts from raw perceptual data (Higgins, Sonnerat, et al., 2018). The goal of this report is to provide a formal account of concepts that is compatible with this latest work in deep learning; the main technical goal is to show how techniques from representation learning can be married with a lattice-theoretic formulation of conceptual spaces. The mathematics of partial orders and lattices is a standard tool for modelling conceptual spaces (Mitchell, 1997, Ch. 2; Ganter & Obiedkov, 2016); however, we are not aware of any formal work that defines a conceptual lattice on top of a representation induced using unsupervised deep learning (Goodfellow et al., 2016). The advantage of partially-ordered lattice structures is that they provide natural mechanisms for use in concept discovery algorithms, through the meets and joins of the lattice.
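To make the role of meets and joins concrete, the following is a minimal sketch, not the report's formalism: it assumes concepts can be represented as sets of symbolic attributes ordered by inclusion, so that meet is set intersection and join is set union (a plain powerset lattice, simpler than the full Formal Concept Analysis construction). The attribute names are purely illustrative.

```python
# Minimal sketch of a lattice of concepts, assuming each concept is a
# frozenset of attributes ordered by inclusion. Under this assumption the
# meet (greatest lower bound) is intersection and the join (least upper
# bound) is union.

def meet(c1: frozenset, c2: frozenset) -> frozenset:
    """Greatest lower bound: attributes shared by both concepts."""
    return c1 & c2

def join(c1: frozenset, c2: frozenset) -> frozenset:
    """Least upper bound: attributes of either concept."""
    return c1 | c2

def leq(c1: frozenset, c2: frozenset) -> bool:
    """Partial order: c1 precedes c2 when its attributes are a subset of c2's."""
    return c1 <= c2

# Hypothetical attribute sets for two concepts.
red_square = frozenset({"red", "square"})
red_circle = frozenset({"red", "circle"})

print(meet(red_square, red_circle))  # frozenset({'red'})
print(join(red_square, red_circle))  # frozenset({'red', 'square', 'circle'})
print(leq(meet(red_square, red_circle), red_square))  # True: the meet lies below both
```

A concept discovery procedure of the kind alluded to above could, under these assumptions, propose new candidate concepts by taking meets of observed ones (their shared attributes); how such attribute sets are obtained from representations learned by unsupervised deep learning is exactly the question the report addresses.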