Deep kernel learning is a promising combination of deep neural networks and nonparametric function learning. However, as a data-driven approach, the performance of deep kernel learning can still be restricted by scarce or insufficient data, especially in extrapolation tasks. To address these limitations, we propose Physics Informed Deep Kernel Learning (PI-DKL), which exploits physics knowledge represented by differential equations with latent sources. Specifically, we use a posterior function sample of the Gaussian process as a surrogate for the solution of the differential equation, and construct a generative component that integrates the equation into a principled Bayesian hybrid framework. For efficient and effective inference, we marginalize out the latent variables in the joint probability and derive a collapsed model evidence lower bound (ELBO), based on which we develop a stochastic model estimation algorithm. Our ELBO can be viewed as an interpretable posterior regularization objective. On synthetic datasets and real-world applications, we demonstrate the advantage of our approach in both prediction accuracy and uncertainty quantification.
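To make the idea concrete, below is a minimal, self-contained sketch (not the authors' implementation) of the general recipe the abstract describes: fit a deep kernel GP to data while penalizing the residual of a known differential equation evaluated on a GP posterior surrogate, so the physics term acts like a posterior regularizer. All names (FeatureNet, the toy ODE u' + u = 0, the penalty weight) are illustrative assumptions; the sketch also uses the posterior mean rather than a full posterior function sample, and a simple weighted penalty rather than the collapsed ELBO derived in the paper.

```python
# Minimal sketch: deep kernel GP regression with a physics-residual penalty.
import torch

torch.manual_seed(0)

class FeatureNet(torch.nn.Module):
    """Small network that warps inputs before the base kernel (deep kernel)."""
    def __init__(self, d_in=1, d_hidden=32, d_out=2):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Linear(d_in, d_hidden), torch.nn.Tanh(),
            torch.nn.Linear(d_hidden, d_out))
    def forward(self, x):
        return self.net(x)

def rbf_kernel(z1, z2, lengthscale, outputscale):
    d2 = torch.cdist(z1 / lengthscale, z2 / lengthscale) ** 2
    return outputscale * torch.exp(-0.5 * d2)

# Toy data: noisy observations of u(x) = exp(-x), which solves u' + u = 0.
x_train = torch.linspace(0.0, 2.0, 15).unsqueeze(-1)
y_train = torch.exp(-x_train).squeeze(-1) + 0.05 * torch.randn(15)
# Collocation points (beyond the training range) where the ODE is enforced.
x_col = torch.linspace(0.0, 4.0, 40).unsqueeze(-1).requires_grad_(True)

feat = FeatureNet()
log_ls, log_os, log_noise = (torch.zeros(1, requires_grad=True) for _ in range(3))
opt = torch.optim.Adam(list(feat.parameters()) + [log_ls, log_os, log_noise], lr=1e-2)

for step in range(500):
    opt.zero_grad()
    ls, os_, noise = log_ls.exp(), log_os.exp(), log_noise.exp()
    z = feat(x_train)
    K = rbf_kernel(z, z, ls, os_) + (noise + 1e-4) * torch.eye(len(x_train))
    L = torch.linalg.cholesky(K)
    alpha = torch.cholesky_solve(y_train.unsqueeze(-1), L)
    # Negative GP marginal log-likelihood (data-fit term, constants dropped).
    nll = 0.5 * (y_train.unsqueeze(-1).T @ alpha).squeeze() \
        + torch.log(torch.diag(L)).sum()
    # GP posterior mean at collocation points, used here as the ODE-solution surrogate.
    u = (rbf_kernel(feat(x_col), z, ls, os_) @ alpha).squeeze(-1)
    du = torch.autograd.grad(u.sum(), x_col, create_graph=True)[0].squeeze(-1)
    physics = ((du + u) ** 2).mean()   # residual of the assumed ODE u' + u = 0
    loss = nll + 10.0 * physics        # physics term plays the role of posterior regularization
    loss.backward()
    opt.step()
```

Under this toy setup, the physics penalty pulls the extrapolated posterior toward functions satisfying the equation outside the observed range, which is the qualitative behavior the abstract attributes to the collapsed ELBO.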