When constructing a Bayesian Machine Learning model, we may be faced with multiple different prior distributions and thus need to incorporate them into our model in a sensible manner. While this situation is reasonably well explored in classical Bayesian Statistics, it appears useful to develop a corresponding method for complex Machine Learning problems. Given their underlying Bayesian framework and their widespread popularity, Gaussian Processes are a good candidate for this task. We therefore extend the idea of Mixture models for Gaussian Process regression to work with multiple prior beliefs at once; both an analytical regression formula and a Sparse Variational approach are considered. In addition, we examine how our approach can also account for the problem of prior misspecification in functional regression problems.
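As a rough illustration of the mixture idea in the exact-regression setting, the sketch below (not the thesis implementation; the kernel choices, toy data, and evidence-based weighting rule are assumptions made for illustration) fits one exact GP posterior per prior belief and combines the resulting predictions with weights proportional to each prior's marginal likelihood.

```python
# Minimal sketch: mixing several GP priors by weighting their exact posteriors
# with the (normalised) marginal likelihood of each prior. All settings below
# are illustrative assumptions, not the method described in the text.
import numpy as np

def rbf_kernel(X1, X2, lengthscale, variance):
    """Squared-exponential kernel k(x, x') = variance * exp(-|x - x'|^2 / (2 * lengthscale^2))."""
    d = X1[:, None, :] - X2[None, :, :]
    return variance * np.exp(-0.5 * np.sum(d ** 2, axis=-1) / lengthscale ** 2)

def gp_posterior_and_evidence(X, y, Xs, lengthscale, variance, noise):
    """Exact GP regression: posterior mean/variance at Xs plus the log marginal likelihood."""
    K = rbf_kernel(X, X, lengthscale, variance) + noise * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    Ks = rbf_kernel(X, Xs, lengthscale, variance)
    mean = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.diag(rbf_kernel(Xs, Xs, lengthscale, variance)) - np.sum(v ** 2, axis=0)
    log_evidence = (-0.5 * y @ alpha
                    - np.sum(np.log(np.diag(L)))
                    - 0.5 * len(X) * np.log(2 * np.pi))
    return mean, var, log_evidence

# Toy data and two competing prior beliefs about the lengthscale.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(20, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(20)
Xs = np.linspace(-3, 3, 100)[:, None]

priors = [dict(lengthscale=0.3, variance=1.0, noise=0.01),   # "wiggly" prior belief
          dict(lengthscale=2.0, variance=1.0, noise=0.01)]   # "smooth" prior belief

results = [gp_posterior_and_evidence(X, y, Xs, **p) for p in priors]
log_w = np.array([r[2] for r in results])
w = np.exp(log_w - log_w.max())
w /= w.sum()                                     # posterior weight of each prior belief

# Mixture predictive mean: weighted combination of the per-prior posterior means.
mixture_mean = sum(wi * r[0] for wi, r in zip(w, results))
print("prior weights:", w)
```

In this toy setup the weights collapse onto whichever prior explains the data better; the Sparse Variational variant mentioned above would replace the exact posterior and marginal likelihood with their variational counterparts.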