This paper introduces a new metamodel-based knowledge representation that significantly improves autonomous learning and adaptation. While hybrid machine learning / symbolic AI systems that leverage, for example, reasoning and knowledge graphs are gaining popularity, we find there remains a need for both a clear definition of knowledge and a metamodel to guide the creation and manipulation of knowledge. Benefits of the metamodel we introduce in this paper include a solution to the symbol grounding problem, cumulative learning, and federated learning. We have applied the metamodel to problems including time series analysis, computer vision, and natural language understanding, and have found that it enables a wide variety of learning mechanisms, from machine learning to graph network analysis and learning by reasoning engines, to interoperate in a highly synergistic way. Our metamodel-based projects have consistently exhibited unprecedented accuracy, performance, and ability to generalize. This paper is inspired by state-of-the-art approaches to AGI, recent AGI-aspiring work, the granular computing community, and Alfred Korzybski's general semantics. One surprising consequence of the metamodel is that it not only enables a new level of autonomous learning and optimal functioning for machine intelligences, but may also shed light on a path to better understanding how to improve human cognition.