Recently, Daniely and Granot [arXiv:1910.05697] introduced a new notion of complexity called Approximate Description Length (ADL). They used it to derive novel generalization bounds for neural networks that, despite substantial work, were out of reach for more classical techniques such as discretization, covering numbers, and Rademacher complexity. In this paper we explore how ADL relates to classical notions of function complexity such as covering numbers and VC dimension. We find that for functions whose range is the reals, ADL is essentially equivalent to these classical complexity measures. However, this equivalence breaks down for functions with a high-dimensional range.