Constraints are a natural choice for prior information in Bayesian inference. In various applications, the parameters of interest lie on the boundary of the constraint set. In this paper, we use a method that implicitly defines a constrained prior such that the posterior assigns positive probability to the boundary of the constraint set. We show that by projecting posterior mass onto the constraint set, we obtain a new posterior with a rich probabilistic structure on the boundary of that set. If the original posterior is Gaussian, such a projection can be computed efficiently. We apply the method to Bayesian linear inverse problems, in which case samples can be obtained by repeatedly solving constrained least squares problems, similar to computing a MAP estimate but with perturbed data. When the method is incorporated into a Bayesian hierarchical model and the constraint set is a polyhedral cone, we derive a Gibbs sampler that samples efficiently from the hierarchical model. To show the effect of projecting the posterior, we apply the method to deblurring and computed tomography examples.
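As a minimal sketch of the sampling idea described above, assume a linear inverse problem with Gaussian noise and a Gaussian prior, and take the constraint set to be the nonnegative orthant (a polyhedral cone). All names below (A, y, sigma_noise, sigma_prior, the toy dimensions) are hypothetical; each projected sample is obtained by perturbing the data and prior mean and solving a nonnegativity-constrained least squares problem, so that mass the unconstrained Gaussian posterior would place outside the constraint set lands on its boundary.

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)

# Hypothetical toy setup: A is the forward operator of a linear inverse
# problem, y the noisy data, and the constraint set is the nonnegative
# orthant (a polyhedral cone).
n, m = 20, 30
A = rng.standard_normal((m, n))
x_true = np.maximum(rng.standard_normal(n), 0.0)
sigma_noise, sigma_prior = 0.1, 1.0
y = A @ x_true + sigma_noise * rng.standard_normal(m)

# Stack data misfit and (unconstrained Gaussian) prior into one
# least squares system; its unconstrained minimizer is the usual MAP estimate.
A_stack = np.vstack([A / sigma_noise, np.eye(n) / sigma_prior])

def projected_posterior_sample():
    # Perturb the data and the prior mean, then solve a nonnegativity-
    # constrained least squares problem. Without the constraint this would
    # draw from the Gaussian posterior; with it, draws falling outside the
    # constraint set are projected onto the boundary, which therefore
    # receives positive probability.
    y_pert = y + sigma_noise * rng.standard_normal(m)
    prior_pert = sigma_prior * rng.standard_normal(n)
    b_stack = np.concatenate([y_pert / sigma_noise, prior_pert / sigma_prior])
    x, _ = nnls(A_stack, b_stack)
    return x

samples = np.array([projected_posterior_sample() for _ in range(200)])
print("fraction of coordinates exactly on the boundary (x_i = 0):",
      np.mean(samples == 0.0))
```

The printed fraction illustrates the boundary mass: a plain Gaussian posterior would put zero probability on the event x_i = 0, whereas the projected posterior assigns it positive probability.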