This paper presents a construction of a proper and stable labelled sample compression scheme of size $O(\VCD^2)$ for any finite concept class, where $\VCD$ denotes the Vapnik-Chervonenkis dimension. The construction is based on a well-known model of machine teaching, referred to as recursive teaching dimension. This substantially improves on the best previously known bound on the size of sample compression schemes (due to Moran and Yehudayoff), which is exponential in $\VCD$. The long-standing open question of whether the smallest size of a sample compression scheme is in $O(\VCD)$ remains unresolved, but our results show that research on machine teaching is a promising avenue for the study of this open problem. As further evidence of the strong connections between machine teaching and sample compression, we prove that the model of no-clash teaching, introduced by Kirkpatrick et al., can be used to define a non-trivial lower bound on the size of stable sample compression schemes.