Stein discrepancies have emerged as a powerful statistical tool, with applications to fundamental statistical problems including parameter inference, goodness-of-fit testing, and sampling. The canonical Stein discrepancies require the derivatives of a statistical model to be computed and, in return, provide theoretical guarantees of convergence detection and control. However, for complex statistical models, the stable numerical computation of derivatives can require bespoke algorithmic development and render Stein discrepancies impractical. This paper focuses on posterior approximation using Stein discrepancies, and introduces a collection of non-canonical Stein discrepancies that are gradient-free, meaning that derivatives of the statistical model are not required. Sufficient conditions for convergence detection and control are established, and applications to sampling and variational inference are presented.
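To illustrate the derivative requirement in the canonical case, the following sketch computes a one-dimensional kernel Stein discrepancy with a Gaussian base kernel. This is a generic textbook construction, not the method of this paper: the function names, bandwidth, and target are our own illustrative choices, and the key point is that the score function, the derivative of the log density of the statistical model, must be supplied explicitly.

```python
import numpy as np

def stein_kernel(x, y, score, h=1.0):
    # Langevin Stein kernel k_0 built from a 1-D Gaussian base kernel
    # k(x, y) = exp(-(x - y)^2 / (2 h^2)); `score` is d/dx log p(x).
    d = x - y
    k = np.exp(-d**2 / (2 * h**2))
    dkx = -d / h**2 * k                    # d/dx k(x, y)
    dky = d / h**2 * k                     # d/dy k(x, y)
    dkxy = (1.0 / h**2 - d**2 / h**4) * k  # d^2/(dx dy) k(x, y)
    return score(x) * score(y) * k + score(x) * dky + score(y) * dkx + dkxy

def ksd(samples, score, h=1.0):
    # V-statistic estimate of the kernel Stein discrepancy:
    # average the Stein kernel over all sample pairs, then take the root.
    X, Y = np.meshgrid(samples, samples)
    return np.sqrt(np.mean(stein_kernel(X, Y, score, h)))

# Target: standard normal, whose score is d/dx log p(x) = -x.
score = lambda x: -x
rng = np.random.default_rng(0)
good = rng.normal(0.0, 1.0, 200)  # samples from the target
bad = rng.normal(2.0, 1.0, 200)   # samples from a shifted distribution

# The discrepancy is smaller for samples drawn from the target.
assert ksd(good, score) < ksd(bad, score)
```

When the score function is unavailable or numerically unstable, this construction cannot be evaluated, which is precisely the motivation for the gradient-free discrepancies introduced in the paper.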