We consider the task of sampling from a log-concave probability distribution. The potential of the target distribution is assumed to be composite, \textit{i.e.}, written as the sum of a smooth convex term and a nonsmooth convex term that may take infinite values. The target distribution can be seen as a minimizer of the Kullback--Leibler divergence defined over the Wasserstein space (\textit{i.e.}, the space of probability measures). In the first part of this paper, we establish a strong duality result for this minimization problem. In the second part, we use the duality gap arising from the first part to study the complexity of the Proximal Stochastic Gradient Langevin Algorithm (PSGLA), which can be seen as a generalization of the Projected Langevin Algorithm. Our approach relies on viewing PSGLA as a primal-dual algorithm and covers many cases where the target distribution is not fully supported. In particular, we show that if the potential is strongly convex, the complexity of PSGLA is $O(1/\varepsilon^2)$ in terms of the 2-Wasserstein distance. In contrast, the complexity of the Projected Langevin Algorithm is $O(1/\varepsilon^{12})$ in terms of total variation when the potential is convex.
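For concreteness, one standard form of the PSGLA iteration (a sketch under the usual composite setting; the notation $\gamma$ for the step size and $\nabla f(x_k, \xi_{k+1})$ for the stochastic gradient oracle is ours, and the exact variant analyzed in the paper may differ) combines a stochastic gradient step on the smooth term $f$, Gaussian noise injection, and a proximal step on the nonsmooth term $g$:
\[
x_{k+1} \;=\; \operatorname{prox}_{\gamma g}\!\Big( x_k \;-\; \gamma\, \nabla f(x_k, \xi_{k+1}) \;+\; \sqrt{2\gamma}\, W_{k+1} \Big),
\qquad W_{k+1} \sim \mathcal{N}(0, I),
\]
where $\operatorname{prox}_{\gamma g}(y) = \arg\min_{x} \big\{ g(x) + \tfrac{1}{2\gamma}\|x - y\|^2 \big\}$. When $g$ is the indicator function of a closed convex set, $\operatorname{prox}_{\gamma g}$ reduces to the Euclidean projection onto that set, and the scheme recovers the Projected Langevin Algorithm.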