When maximum likelihood estimation is infeasible, one often turns to score matching, contrastive divergence, or minimum probability flow to obtain tractable parameter estimates. We provide a unifying perspective on these techniques as minimum Stein discrepancy estimators, and use this lens to design new diffusion kernel Stein discrepancy (DKSD) and diffusion score matching (DSM) estimators with complementary strengths. We establish the consistency, asymptotic normality, and robustness of DKSD and DSM estimators, then derive stochastic Riemannian gradient descent algorithms for their efficient optimisation. The main strength of our methodology is its flexibility: by carefully selecting a Stein discrepancy, we can design estimators with properties tailored to the specific model at hand. We illustrate this advantage on several problems that are challenging for score matching, such as non-smooth, heavy-tailed or light-tailed densities.
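As a point of reference for the estimators discussed above, the following is a minimal sketch of the classical (Hyvärinen) score matching objective that DSM generalises, assuming a one-dimensional Gaussian model. The function name `sm_loss` and the grid search over the mean are illustrative choices, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.5, size=5000)  # samples from the true density

def sm_loss(theta, x):
    """Empirical score matching objective E[0.5 * s(x)^2 + s'(x)]
    for the Gaussian model N(mu, sigma^2), theta = (mu, log_sigma)."""
    mu, log_sigma = theta
    sigma2 = np.exp(2.0 * log_sigma)
    score = -(x - mu) / sigma2    # s_theta(x) = d/dx log p_theta(x)
    dscore = -1.0 / sigma2        # d/dx s_theta(x)
    return np.mean(0.5 * score**2 + dscore)

# Crude grid search over mu (sigma held at the true value) to illustrate
# that the objective is minimised near the data mean.
mus = np.linspace(0.0, 4.0, 81)
best_mu = mus[np.argmin([sm_loss((m, np.log(1.5)), data) for m in mus])]
```

The key point is that the objective depends only on the score `d/dx log p_theta(x)`, so the intractable normalising constant of the model never appears; DKSD and DSM retain this property while introducing a diffusion matrix that can be chosen to suit the model.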