We propose probabilistic task modelling -- a generative probabilistic model for collections of tasks used in meta-learning. The proposed model combines variational auto-encoding and latent Dirichlet allocation to model each task as a mixture of Gaussian distributions in an embedding space. Such modelling provides an explicit representation of a task through its task-theme mixture. We present an efficient approximate inference technique based on variational inference for empirical Bayes parameter estimation. We perform empirical evaluations to validate the task uncertainty and task distance produced by the proposed method through correlation diagrams of prediction accuracy on test tasks. We also carry out task-selection experiments in meta-learning to demonstrate how the task relatedness inferred from the proposed model helps to facilitate meta-learning algorithms.
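To make the generative view concrete, the following is a minimal sketch (not the paper's implementation) of the LDA-style generative process the abstract describes: each task draws a task-theme mixture from a Dirichlet prior, and each datapoint's embedding is generated by the Gaussian component of the theme it selects. The theme count `K`, embedding dimension `D`, and the fixed theme means are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: K themes, each a Gaussian in a D-dim embedding space.
K, D = 3, 2
alpha = np.ones(K)                       # symmetric Dirichlet prior over themes
theme_means = rng.normal(size=(K, D))    # one Gaussian mean per theme (assumed fixed)

def sample_task_embeddings(n_points, sigma=0.1):
    """Sample one task: its task-theme mixture and n_points embeddings."""
    theta = rng.dirichlet(alpha)                     # task-theme mixture
    themes = rng.choice(K, size=n_points, p=theta)   # per-point theme assignment
    noise = sigma * rng.normal(size=(n_points, D))
    points = theme_means[themes] + noise             # mixture-of-Gaussians embeddings
    return theta, points

theta, points = sample_task_embeddings(50)
```

Here `theta` is the explicit task representation mentioned in the abstract: two tasks can be compared (e.g. for task distance or task selection) directly through their theme mixtures.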