We study multiclass online prediction where the learner can predict using a list of multiple labels (as opposed to just one label, as in the traditional setting). We characterize learnability in this model using the $b$-ary Littlestone dimension. This dimension is a variation of the classical Littlestone dimension in which binary mistake trees are replaced with $(k+1)$-ary mistake trees, where $k$ is the number of labels in the list. In the agnostic setting, we explore different scenarios depending on whether the comparator class consists of single-labeled or multi-labeled functions, and on how this choice trades off against the size of the lists the algorithm uses. We find that it is possible to achieve negative regret in some cases, and we provide a complete characterization of when this is possible. As part of our work, we adapt classical algorithms such as Littlestone's SOA and Rosenblatt's Perceptron to predict using lists of labels. We also establish combinatorial results for list-learnable classes, including a list online version of the Sauer-Shelah-Perles Lemma. We state our results within the framework of pattern classes, a generalization of hypothesis classes which can represent adaptive hypotheses (i.e., functions with memory) and model data-dependent assumptions such as linear classification with margin.
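The $(k+1)$-ary shrinkage that drives mistake bounds in the list setting can be illustrated with a list version of the classical Halving algorithm over a finite class (a simplified sketch for intuition, not the paper's SOA adaptation; the function name and setup are illustrative): predicting the $k$ plurality labels of the consistent experts guarantees that each mistake shrinks the version space by a factor of at least $k+1$, for at most $\log_{k+1}|\mathcal{H}|$ mistakes on realizable sequences.

```python
from collections import Counter

def list_halving(experts, stream, k):
    """List-of-k Halving over a finite expert class (illustrative sketch).

    On each round, predict the k labels receiving the most votes among the
    experts still consistent with the data. If the true label is outside the
    top k, at most a 1/(k+1) fraction of the consistent experts voted for it,
    so the version space shrinks by a factor of at least k+1 per mistake.
    """
    version_space = list(experts)
    mistakes = 0
    for x, y in stream:
        votes = Counter(h(x) for h in version_space)
        prediction = [label for label, _ in votes.most_common(k)]
        if y not in prediction:
            mistakes += 1
        # Realizable setting: discard every expert inconsistent with y.
        version_space = [h for h in version_space if h(x) == y]
    return mistakes
```

With $|\mathcal{H}|$ experts this makes at most $\lfloor \log_{k+1}|\mathcal{H}| \rfloor$ mistakes, compared with $\lfloor \log_2|\mathcal{H}| \rfloor$ for single-label Halving, which is the flavor of improvement that list prediction buys.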