One goal in Bayesian machine learning is to encode prior knowledge in prior distributions, so as to model data efficiently. We consider prior knowledge given by systems of linear partial differential equations together with their boundary conditions. We construct multi-output Gaussian process priors whose realizations lie in the solution set of such a system; in particular, only such solutions can be represented by Gaussian process regression. The construction is fully algorithmic via Gr\"obner bases and does not employ any approximation. It builds these priors by combining two parametrizations via a pullback: the first parametrizes the solutions of the system of differential equations and the second parametrizes all functions adhering to the boundary conditions.
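The core idea of pushing a latent Gaussian process through a parametrization of a PDE's solution set can be illustrated on a toy case. The sketch below is an assumption-laden illustration, not the paper's Gr\"obner-basis algorithm: it takes the divergence equation $\partial u/\partial x + \partial v/\partial y = 0$ on $\mathbb{R}^2$, whose solutions are parametrized by a scalar potential $\varphi$ via $B\colon \varphi \mapsto (\partial\varphi/\partial y,\, -\partial\varphi/\partial x)$, and pushes a squared-exponential GP prior on $\varphi$ through $B$ by differentiating the kernel. Every sample of the resulting two-output GP is then divergence-free by construction.

```python
import numpy as np

# Illustrative sketch (not the paper's algorithm): a multi-output GP prior
# whose realizations satisfy du/dx + dv/dy = 0 on R^2. The solution set is
# parametrized by a scalar potential phi via  B: phi |-> (phi_y, -phi_x),
# so pushing a scalar GP through B gives a divergence-free GP prior.

def k(x, xp):
    """Squared-exponential kernel on R^2 (unit lengthscale and variance)."""
    d = x - xp
    return np.exp(-0.5 * np.dot(d, d))

def K_block(x, xp):
    """2x2 cross-covariance of (u, v) = B phi at points x and xp.

    Obtained by differentiating k; for the SE kernel,
    d^2 k / (dx_i dxp_j) = (delta_ij - d_i d_j) * k  with d = x - xp.
    """
    d = x - xp
    kk = k(x, xp)
    K_uu = (1.0 - d[1] ** 2) * kk    # cov(phi_y(x),  phi_y(xp))
    K_vv = (1.0 - d[0] ** 2) * kk    # cov(phi_x(x),  phi_x(xp))
    K_uv = d[0] * d[1] * kk          # cov(phi_y(x), -phi_x(xp))
    return np.array([[K_uu, K_uv], [K_uv, K_vv]])

# Assemble the joint covariance on a small grid and draw one prior sample.
g = np.linspace(0.0, 2.0, 6)
pts = np.array([(a, b) for a in g for b in g])
n = len(pts)
K = np.zeros((2 * n, 2 * n))
for i, x in enumerate(pts):
    for j, xp in enumerate(pts):
        K[2 * i:2 * i + 2, 2 * j:2 * j + 2] = K_block(x, xp)

rng = np.random.default_rng(0)
L = np.linalg.cholesky(K + 1e-8 * np.eye(2 * n))
sample = L @ rng.standard_normal(2 * n)  # one divergence-free vector field

# Sanity check: for fixed xp, the map x |-> (K_uu, K_vu)(x, xp) is itself
# a divergence-free field; verify via central finite differences.
xp0 = np.array([0.3, 0.7])
x0 = np.array([0.5, 0.5])
h = 1e-5
e1, e2 = np.array([h, 0.0]), np.array([0.0, h])
col = lambda x: K_block(x, xp0)[:, 0]
div = ((col(x0 + e1)[0] - col(x0 - e1)[0]) / (2 * h)
       + (col(x0 + e2)[1] - col(x0 - e2)[1]) / (2 * h))
# div is numerically zero, reflecting that the prior is supported on the
# solution set of the divergence equation.
```

Gaussian process regression with this prior then interpolates observed field values while producing only divergence-free posterior fields; the paper's contribution is to compute such parametrizations algorithmically for general systems and to combine them with a second parametrization for the boundary conditions.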