In this paper we analyze the graph-based approach to semi-supervised learning under a manifold assumption. We adopt a Bayesian perspective and demonstrate that, for a suitable choice of prior constructed with sufficiently many unlabeled data, the posterior contracts around the truth at a rate that is minimax optimal up to a logarithmic factor. Our theory covers both regression and classification.