We study multiclass online prediction in which the learner may predict using a list of multiple labels (as opposed to just one label in the traditional setting). We characterize learnability in this model using the $b$-ary Littlestone dimension, a variation of the classical Littlestone dimension in which binary mistake trees are replaced with $b$-ary mistake trees, where $b = k+1$ and $k$ is the number of labels in the list. In the agnostic setting, we explore different scenarios depending on whether the comparator class consists of single-labeled or multi-labeled functions, and on the tradeoff between this choice and the size of the lists the algorithm uses. We show that negative regret is achievable in some cases and provide a complete characterization of when this is possible. As part of our work, we adapt classical algorithms such as Littlestone's SOA and Rosenblatt's Perceptron to predict using lists of labels. We also establish combinatorial results for list-learnable classes, including a list online version of the Sauer-Shelah-Perles Lemma. We state our results within the framework of pattern classes, a generalization of hypothesis classes which can represent adaptive hypotheses (i.e., functions with memory) and model data-dependent assumptions such as linear classification with margin.