In recent years, Bayesian inference for large-scale inverse problems arising in science, engineering, and machine learning has gained significant attention. This paper examines the robustness of the Bayesian approach by analyzing the stability of posterior measures under perturbations of both the likelihood potential and the prior measure. We present new stability results using a family of integral probability metrics (divergences) akin to the dual problems that arise in optimal transport. Our results stand out from previous works in three directions: (1) we construct new families of integral probability metrics adapted to the problem at hand; (2) these new metrics allow us to study both likelihood and prior perturbations in a convenient way; and (3) our analysis accommodates likelihood potentials that are only locally Lipschitz, making the results applicable to a wide range of nonlinear inverse problems. Our theoretical findings are further reinforced through specific and novel examples in which approximation rates of posterior measures are obtained for different types of perturbations; these examples provide a path toward the convergence analysis of machine learning techniques recently adapted to Bayesian inverse problems, such as data-driven priors and neural network surrogates.
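For context, the general form of an integral probability metric is standard: given a class $\mathcal{F}$ of test functions, the distance between two probability measures $\mu$ and $\nu$ is defined as (this is the generic definition; the specific function classes constructed in the paper are adapted to the problem at hand and may differ):

```latex
d_{\mathcal{F}}(\mu, \nu) \;=\; \sup_{f \in \mathcal{F}} \left| \int f \, \mathrm{d}\mu - \int f \, \mathrm{d}\nu \right|.
```

Familiar special cases include the total variation distance ($\mathcal{F}$ the functions bounded by one) and the Wasserstein-1 distance ($\mathcal{F}$ the 1-Lipschitz functions), the latter via Kantorovich--Rubinstein duality, which is the sense in which such metrics are "akin to dual problems that arise in optimal transport."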