Most expressive variational families -- such as normalizing flows -- lack practical convergence guarantees, as their theoretical assurances typically hold only at the intractable global optimum. In this work, we present a general recipe for constructing tuning-free, asymptotically exact variational flows on arbitrary state spaces from involutive MCMC kernels. The core methodological component is a novel representation of general involutive MCMC kernels as invertible, measure-preserving iterated random function systems, which act as the flow maps of our variational flows. This leads to three new variational families with provable total variation convergence. Our framework resolves key practical limitations of existing variational families with similar guarantees (e.g., MixFlows), while requiring substantially weaker theoretical assumptions. Finally, we demonstrate the competitive performance of our flows across tasks including posterior approximation, Monte Carlo estimation, and normalization constant estimation, outperforming or matching the No-U-Turn sampler (NUTS) and black-box normalizing flows.