We propose IB-UQ, a novel framework for uncertainty quantification via the information bottleneck, for scientific machine learning tasks including deep neural network (DNN) regression and neural operator learning (DeepONet). Specifically, we implement the bottleneck with a confidence-aware encoder, which encodes inputs into latent representations according to the confidence that an input lies in the region covered by the training data, and a Gaussian decoder that predicts the mean and variance of the output conditional on the latent variable. Furthermore, we propose a data-augmentation-based information bottleneck objective that improves the quality of extrapolation uncertainty quantification; both the encoder and the decoder can be trained by minimizing a tractable variational bound on this objective. Compared with uncertainty quantification (UQ) methods for scientific learning tasks that rely on Bayesian neural networks with Hamiltonian Monte Carlo posterior estimators, the proposed model is computationally efficient, particularly on large-scale data sets. We demonstrate the effectiveness of IB-UQ on several representative examples: regression of discontinuous functions, regression on a real-world data set, learning nonlinear operators for partial differential equations, and a large-scale climate model. The experimental results indicate that IB-UQ can handle noisy data, produce robust predictions, and deliver confident uncertainty estimates for out-of-distribution data.
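To make the architecture concrete, the following is a minimal NumPy sketch of the general pattern the abstract describes: a stochastic encoder compresses the input into a latent variable, a Gaussian decoder predicts the output mean and variance, and training minimizes a variational bound combining a Gaussian negative log-likelihood with a KL bottleneck penalty. All names, dimensions, and the standard-normal prior are illustrative assumptions; the paper's actual confidence-aware encoder and data-augmentation-based objective differ in detail.

```python
import numpy as np

rng = np.random.default_rng(0)

def encoder(x, W_mu, W_logvar):
    # Stochastic encoder: q(z|x) = N(mu(x), diag(exp(logvar(x)))).
    # (Illustrative linear maps; the paper uses a confidence-aware DNN encoder.)
    return x @ W_mu, x @ W_logvar

def decoder(z, W_y, W_s):
    # Gaussian decoder: predicts mean and log-variance of the output given z.
    return z @ W_y, z @ W_s

def ib_loss(x, y, params, beta=1e-2):
    # Variational-IB-style bound (assumed form): Gaussian NLL of the targets
    # plus beta * KL(q(z|x) || N(0, I)), which penalizes information retained
    # about the input and induces the bottleneck.
    W_mu, W_logvar, W_y, W_s = params
    mu, logvar = encoder(x, W_mu, W_logvar)
    # Reparameterization trick: z = mu + sigma * eps, eps ~ N(0, I).
    z = mu + np.exp(0.5 * logvar) * rng.standard_normal(mu.shape)
    y_mean, y_logvar = decoder(z, W_y, W_s)
    nll = 0.5 * np.mean(y_logvar + (y - y_mean) ** 2 / np.exp(y_logvar))
    kl = 0.5 * np.mean(np.sum(np.exp(logvar) + mu**2 - 1.0 - logvar, axis=1))
    return nll + beta * kl

# Toy 1-D regression data with a 4-dimensional latent space.
x = rng.standard_normal((64, 1))
y = np.sin(3.0 * x) + 0.1 * rng.standard_normal((64, 1))
d_z = 4
params = (0.1 * rng.standard_normal((1, d_z)),
          0.1 * rng.standard_normal((1, d_z)),
          0.1 * rng.standard_normal((d_z, 1)),
          0.1 * rng.standard_normal((d_z, 1)))
loss = ib_loss(x, y, params)
print(float(loss))
```

In a real implementation the linear maps would be deep networks trained by gradient descent on this bound, and the predictive variance from the decoder provides the uncertainty estimate.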