The multitask Gaussian process (MTGP) is a powerful framework for jointly learning multiple tasks with complicated correlation patterns. However, because they are assembled from additive, independent latent functions, all current MTGPs, including the prominent linear model of coregionalization (LMC) and convolution frameworks, cannot effectively represent or learn the hierarchical latent interactions between their latent functions. In this paper, we further investigate the interactions in the LMC formulation of MTGP and propose a novel kernel representation of these hierarchical interactions, which improves both the expressiveness and the interpretability of MTGP. Specifically, we express the interaction as a product of a function interaction and a coefficient interaction. The function interaction is modeled by the cross convolution of latent functions, and the coefficient interaction between LMCs is described by a cross-coregionalization term. We validate that accounting for these interactions promotes knowledge transfer in MTGP, and we compare our approach with state-of-the-art MTGPs on both synthetic and real-world datasets.
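For context, a minimal sketch of the standard LMC construction referred to above (the notation is ours, not taken from the paper): each task output $f_t$ is a linear combination of $Q$ independent latent Gaussian processes $u_q \sim \mathcal{GP}(0, k_q)$ with task-specific coefficients $a_{t,q}$,

$$
f_t(x) = \sum_{q=1}^{Q} a_{t,q}\, u_q(x),
\qquad
\operatorname{cov}\!\big(f_t(x), f_{t'}(x')\big) = \sum_{q=1}^{Q} B_q[t,t']\, k_q(x,x'),
\qquad B_q = \mathbf{a}_q \mathbf{a}_q^{\top}.
$$

Because the $u_q$ are assumed independent, no cross terms $\operatorname{cov}\big(u_q(x), u_{q'}(x')\big)$ with $q \neq q'$ enter the multitask kernel. The interaction kernel proposed here is intended to capture exactly those missing terms, expressed as a product of a cross-convolution term between latent functions and a cross-coregionalization term between their coefficients.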