We introduce a new family of particle evolution samplers suitable for constrained domains and non-Euclidean geometries. Stein Variational Mirror Descent and Mirrored Stein Variational Gradient Descent minimize the Kullback-Leibler (KL) divergence to constrained target distributions by evolving particles in a dual space defined by a mirror map. Stein Variational Natural Gradient exploits non-Euclidean geometry to more efficiently minimize the KL divergence to unconstrained targets. We derive these samplers from a new class of mirrored Stein operators and adaptive kernels developed in this work. We demonstrate that these new samplers yield accurate approximations to distributions on the simplex, deliver valid confidence intervals in post-selection inference, and converge more rapidly than prior methods in large-scale unconstrained posterior inference. Finally, we establish the convergence of our new procedures under verifiable conditions on the target distribution.