The Bayesian Conjugate Gradient method (BayesCG) is a probabilistic generalization of the Conjugate Gradient method (CG) for solving linear systems with real symmetric positive definite coefficient matrices. Our CG-based implementation of BayesCG under a structure-exploiting prior distribution represents an 'uncertainty-aware' version of CG. Its output consists of CG iterates and posterior covariances that can be propagated to subsequent computations. The covariances have low rank and are maintained in factored form. This allows easy generation of accurate samples to probe uncertainty in downstream computations. Numerical experiments confirm the effectiveness of the low-rank posterior covariances.
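To illustrate why keeping the posterior covariance in factored form makes sampling easy, the sketch below draws samples from a Gaussian posterior N(mu, F F^T) directly from its low-rank factor F. This is a minimal illustration of the general low-rank sampling idea, not the authors' implementation; the function name, the stand-in mean and factor, and the downstream functional are hypothetical.

```python
import numpy as np

# Minimal sketch (not the authors' code): sample from N(mu, Sigma) when the
# covariance is stored in factored form Sigma = F @ F.T with F of shape (n, r), r << n.
def sample_factored_gaussian(mu, F, n_samples=1, rng=None):
    """Return samples x = mu + F z with z ~ N(0, I_r), so x ~ N(mu, F F^T)."""
    rng = np.random.default_rng(rng)
    r = F.shape[1]
    Z = rng.standard_normal((r, n_samples))   # r-dimensional standard normal draws
    return mu[:, None] + F @ Z                # each column is one posterior sample

if __name__ == "__main__":
    # Hypothetical usage: propagate solver uncertainty to a downstream quantity g(x).
    n, r = 100, 5                                  # problem size and covariance rank (illustrative)
    rng = np.random.default_rng(0)
    mu = rng.standard_normal(n)                    # stand-in for a posterior mean (solver iterate)
    F = rng.standard_normal((n, r)) / np.sqrt(n)   # stand-in low-rank covariance factor
    samples = sample_factored_gaussian(mu, F, n_samples=1000, rng=rng)
    g = np.linalg.norm(samples, axis=0)            # example downstream functional of the solution
    print("g(x): mean %.3f, std %.3f" % (g.mean(), g.std()))
```

The cost per sample is O(nr) plus r standard normal draws, which is what makes probing downstream uncertainty cheap when r is small.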