Recently, Neural Ordinary Differential Equations (Neural ODEs) have emerged as a powerful framework for modeling physical simulations without explicitly defining the ODEs governing the system, instead learning them via machine learning. However, one question remains unanswered: can Bayesian learning frameworks be integrated with Neural ODEs to robustly quantify the uncertainty in the weights of a Neural ODE? In an effort to address this question, we demonstrate the successful integration of Neural ODEs with two methods of Bayesian inference: (a) the No-U-Turn MCMC sampler (NUTS) and (b) Stochastic Gradient Langevin Dynamics (SGLD). We test the performance of our Bayesian Neural ODE approach on classical physical systems as well as on standard machine learning datasets such as MNIST, using GPU acceleration. Finally, using a simple example, we demonstrate the probabilistic identification of model specification in partially described dynamical systems with universal ordinary differential equations. Together, this yields a scientific machine learning tool for the probabilistic estimation of epistemic uncertainties.
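To make the SGLD variant concrete, the following is a minimal sketch, not the authors' implementation: it pairs a tiny Neural ODE (a fixed-step Euler integrator over a learned right-hand side) with a hand-rolled SGLD loop in PyTorch. The toy 1-D decay system, the isotropic Gaussian prior, the constant step size `eps`, and names such as `f_theta` and `integrate` are all illustrative assumptions; in practice SGLD is typically run with a decaying step-size schedule (Welling & Teh, 2011).

```python
# Minimal sketch: SGLD sampling over the weights of a toy Neural ODE.
# Assumptions (ours, not from the paper): 1-D exponential-decay data,
# fixed-step Euler integration, isotropic Gaussian prior, constant step size.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Neural ODE right-hand side: dz/dt = f_theta(z).
f_theta = nn.Sequential(nn.Linear(1, 16), nn.Tanh(), nn.Linear(16, 1))

def integrate(z0, t):
    """Fixed-step Euler integration of dz/dt = f_theta(z) over the grid t."""
    zs = [z0]
    for i in range(len(t) - 1):
        dt = t[i + 1] - t[i]
        zs.append(zs[-1] + dt * f_theta(zs[-1]))
    return torch.stack(zs)

# Synthetic observations of dz/dt = -z with Gaussian noise.
t = torch.linspace(0.0, 2.0, 40)
z0 = torch.tensor([[1.0]])
z_true = torch.exp(-t).reshape(-1, 1, 1)
data = z_true + 0.02 * torch.randn_like(z_true)

eps = 1e-6            # SGLD step size (would need tuning/decay in practice)
prior_var = 1.0       # variance of the Gaussian prior on each weight
noise_var = 0.02 ** 2 # assumed observation-noise variance
samples = []

for step in range(3000):
    pred = integrate(z0, t)
    # Negative log posterior (up to a constant): Gaussian likelihood + prior.
    nll = ((pred - data) ** 2).sum() / (2 * noise_var)
    neg_log_prior = sum((p ** 2).sum() for p in f_theta.parameters()) / (2 * prior_var)
    loss = nll + neg_log_prior
    f_theta.zero_grad()
    loss.backward()
    with torch.no_grad():
        for p in f_theta.parameters():
            # Langevin update: half-step down the gradient plus N(0, eps) noise.
            p -= 0.5 * eps * p.grad
            p += eps ** 0.5 * torch.randn_like(p)
    # After burn-in, thin the chain and store approximate posterior samples.
    if step >= 2000 and step % 20 == 0:
        samples.append([p.detach().clone() for p in f_theta.parameters()])

print(f"collected {len(samples)} approximate posterior weight samples")
```

Re-running `integrate` after loading each stored weight sample gives one draw from the posterior predictive trajectory; the spread across draws is the kind of epistemic uncertainty estimate the abstract refers to.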