Recent research in machine teaching has explored the instruction of any concept expressed in a universal language. In this compositional context, new experimental results have shown that there exist data teaching sets that are surprisingly shorter than the concept description itself. However, there is a bound, relating teaching size and concept complexity, that constrains these remarkable experimental findings and that we explore further here. As concepts are rarely taught in isolation, we investigate the best configuration in which to teach a given set of concepts, where those acquired first can be reused in the description of new ones. This new notion of conditional teaching size uncovers new insights, such as the interposition phenomenon: certain prior knowledge generates simpler compatible concepts that increase the teaching size of the concept we want to teach. This does not happen for conditional Kolmogorov complexity. Furthermore, we provide an algorithm that constructs optimal curricula based on interposition avoidance. This paper presents a series of theoretical results, including their proofs, and outlines some directions for future work. These results open up new research possibilities in curriculum teaching for compositional scenarios.
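To make the contrast concrete, a schematic formulation may help; the notation is assumed here rather than quoted from the paper, writing $TS(c)$ for the teaching size of a concept $c$, $TS(c \mid B)$ for its teaching size given prior background knowledge $B$, and $K$ for Kolmogorov complexity. Interposition means that prior knowledge can make teaching strictly harder, whereas conditioning can never increase Kolmogorov complexity by more than a constant:
\[
\exists\, c, B:\;\; TS(c \mid B) > TS(c),
\qquad\text{whereas}\qquad
\forall\, c, B:\;\; K(c \mid B) \le K(c) + O(1).
\]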