Model calibration involves using experimental or field data to estimate the unknown parameters of a mathematical model. This task is complicated by discrepancy between the model and reality, and by possible bias in the data. We consider model calibration in the presence of both model discrepancy and measurement bias, using multiple sources of data. Model discrepancy is often estimated with a Gaussian stochastic process (GaSP), yet many studies have observed that the calibrated mathematical model can be far from reality. Here we show that modeling the discrepancy function via a GaSP often leads to inconsistent estimation of the calibration parameters, even with an infinite number of repeated experiments and an infinite number of observations over a fixed input domain in each experiment. We introduce the scaled Gaussian stochastic process (S-GaSP) to model the discrepancy function. Unlike the GaSP, the S-GaSP employs a non-increasing scaling function that assigns more probability mass to smaller $L_2$ loss between the mathematical model and reality, preventing the calibrated mathematical model from deviating too far from reality. We apply our technique to the calibration of a geophysical model of K\={\i}lauea Volcano, Hawai`i, using multiple radar satellite interferograms. We compare calibration using multiple data sets simultaneously with results obtained using stacks (averages). We derive distributions for the maximum likelihood estimator and for Bayesian inference, both implemented in the "RobustCalibration" package available on CRAN. Analyses of both simulated and real data confirm that our approach can identify the measurement bias and model discrepancy using multiple sources of data, and provide better estimates of model parameters.
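To make the role of the scaling function concrete, the following is a minimal sketch of one way the S-GaSP prior can be written down; the symbols $\delta$, $K$, $Z$, $g$, and $\lambda$ are illustrative notation introduced here and are not fixed by the abstract above. Starting from a zero-mean GaSP prior $\delta(\cdot)\sim \mathrm{GaSP}\bigl(0,\sigma^2 K(\cdot,\cdot)\bigr)$ on the discrepancy, let $Z$ denote its squared $L_2$ loss over the input domain $\mathcal{X}$,
\[
Z \;=\; \int_{\mathcal{X}} \delta^2(\mathbf{x})\,\mathrm{d}\mathbf{x},
\]
and define the scaled process by conditioning on $Z$ after rescaling its GaSP-induced density $p_\delta(z)$ with a non-increasing scaling function $g$,
\[
\delta_z(\cdot) \;=\; \bigl\{\delta(\cdot)\mid Z=z\bigr\},
\qquad
p_Z(z) \;\propto\; g(z)\,p_\delta(z),
\]
for instance $g(z)\propto \exp(-\lambda z)$ with $\lambda>0$, so that realizations with small $L_2$ loss between the calibrated model and reality receive more prior probability mass.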