In this paper, we propose a deterministic variational inference approach that generates low-discrepancy points by minimizing the kernel discrepancy, also known as the Maximum Mean Discrepancy (MMD). Based on the general energetic variational inference framework of Wang et al. (2021), minimizing the kernel discrepancy is transformed into solving a dynamic ODE system via the explicit Euler scheme. We name the resulting algorithm EVI-MMD and demonstrate it through examples in which the target distribution is fully specified, partially specified up to the normalizing constant, and empirically known in the form of training data. Its performance is satisfactory compared with alternative methods in applications to distribution approximation, numerical integration, and generative learning. The EVI-MMD algorithm overcomes a bottleneck of existing MMD-descent algorithms, which are mostly applicable only to two-sample problems. Algorithms with more sophisticated structures and potential advantages can be developed under the EVI framework.
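The core iteration described above can be sketched as explicit-Euler (gradient-descent) steps on the squared MMD between a set of particles and samples from the target. The sketch below is illustrative, not the authors' implementation: the Gaussian RBF kernel, the bandwidth `h`, the step size `eta`, and the 1-D Gaussian target are all assumptions chosen for a minimal runnable example of the two-sample (training-data) setting.

```python
import numpy as np

def rbf(a, b, h):
    # Pairwise Gaussian RBF kernel k(a_i, b_j) = exp(-||a_i - b_j||^2 / (2 h^2)).
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * h ** 2))

def mmd2_grad(x, y, h):
    # Gradient of the squared MMD between particles x (n, d) and
    # target samples y (m, d) with respect to each particle x_i.
    n, m = len(x), len(y)
    kxx = rbf(x, x, h)                                   # (n, n)
    kxy = rbf(x, y, h)                                   # (n, m)
    # grad_{x_i} k(x_i, z) = -(x_i - z) / h^2 * k(x_i, z)
    gxx = -(x[:, None, :] - x[None, :, :]) / h ** 2 * kxx[:, :, None]
    gxy = -(x[:, None, :] - y[None, :, :]) / h ** 2 * kxy[:, :, None]
    return 2.0 * gxx.sum(1) / n ** 2 - 2.0 * gxy.sum(1) / (n * m)

rng = np.random.default_rng(0)
y = rng.normal(2.0, 1.0, size=(500, 1))   # empirical "training data" from the target
x = rng.normal(0.0, 0.5, size=(100, 1))   # particles to evolve
eta, h = 20.0, 1.0                        # illustrative step size and bandwidth
for _ in range(3000):
    x = x - eta * mmd2_grad(x, y, h)      # explicit Euler step on the MMD flow
```

After the loop, the particle cloud `x` has drifted from its initial mean near 0 toward the target mean near 2, illustrating the particle-evolution view of MMD minimization in the simplest setting.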