We prove that every online learnable class of functions with Littlestone dimension $d$ admits a learning algorithm of finite information complexity. To this end, we use the notion of a globally stable algorithm. In general, the information complexity of such a globally stable algorithm is large yet finite, roughly exponential in $d$. We also show there is room for improvement: for a canonical online learnable class, the indicator functions of affine subspaces of dimension $d$, the information complexity can be upper bounded logarithmically in $d$.