This article considers semi-supervised multitask learning on a Gaussian mixture model (GMM). Using methods from statistical physics, we compute the asymptotic Bayes risk of each task in the regime of large datasets in high dimension, from which we analyze the role of task similarity in learning and evaluate the performance gain when the tasks are learned jointly rather than separately. In the supervised case, we derive a simple algorithm that attains Bayes-optimal performance.
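To make the setting concrete, below is a minimal toy sketch, not the paper's code or notation: a hypothetical two-task Gaussian mixture in which the class means of the tasks are correlated with a similarity parameter, only a fraction of labels is observed, and a naive plug-in estimator is compared with and without pooling the second task. All names (`d`, `n`, `alpha`, `rho`) and the data model `x = y*mu/sqrt(d) + z` are illustrative assumptions; the paper's Bayes-optimal algorithm is not reproduced here.

```python
# Toy illustration (assumed model, not the paper's): two binary tasks on
# correlated Gaussian mixtures, with partially observed labels.
import numpy as np

rng = np.random.default_rng(0)

d = 500       # dimension (the paper's analysis is asymptotic in d)
n = 2000      # samples per task
alpha = 0.7   # hypothetical task similarity: correlation between task means
rho = 0.1     # fraction of labeled samples (semi-supervised setting)

# Correlated class means for the two tasks.
mu1 = rng.standard_normal(d)
nu = rng.standard_normal(d)
mu2 = alpha * mu1 + np.sqrt(1 - alpha**2) * nu

def sample_task(mu, n):
    """Draw n points x = y * mu / sqrt(d) + z from a symmetric two-cluster GMM."""
    y = rng.choice([-1.0, 1.0], size=n)
    x = y[:, None] * mu[None, :] / np.sqrt(d) + rng.standard_normal((n, d))
    labeled = rng.random(n) < rho  # mask of observed labels
    return x, y, labeled

x1, y1, lab1 = sample_task(mu1, n)
x2, y2, lab2 = sample_task(mu2, n)

# Naive plug-in estimate of mu1 (average of label * input on labeled data),
# using task 1 alone versus pooling the correlated task 2, reweighted by alpha.
m1_single = np.sqrt(d) * (y1[lab1, None] * x1[lab1]).mean(axis=0)
m1_joint = np.sqrt(d) * np.vstack([
    y1[lab1, None] * x1[lab1],
    alpha * y2[lab2, None] * x2[lab2],
]).mean(axis=0)

def cos(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

print("overlap with mu1, single-task:", cos(m1_single, mu1))
print("overlap with mu1, joint      :", cos(m1_joint, mu1))
```

In this toy run the pooled estimate typically has a higher overlap with `mu1` than the single-task one, mirroring the claim that learning correlated tasks together can help; the paper quantifies this gain exactly through the asymptotic Bayes risk.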