Sampling from log-concave distributions is a central problem in statistics and machine learning. Prior work establishes theoretical guarantees for Langevin Monte Carlo algorithms based on overdamped and underdamped Langevin dynamics and, more recently, for third-order variants. In this paper, we introduce a new sampling algorithm built on a general $K$th-order Langevin dynamics, extending beyond second- and third-order methods. To discretize the $K$th-order dynamics, we approximate the drift induced by the potential via Lagrange interpolation and refine the node values at the interpolation points using Picard-iteration corrections, yielding a flexible scheme that fully exploits the acceleration afforded by higher-order Langevin dynamics. For targets with smooth, strongly log-concave densities, we prove dimension-dependent convergence in Wasserstein distance: the sampler achieves $\varepsilon$-accuracy within $\widetilde O(d^{\frac{K-1}{2K-3}}\varepsilon^{-\frac{2}{2K-3}})$ gradient evaluations for $K \ge 3$. To the best of our knowledge, this is the first sampling algorithm achieving this query complexity. The rate improves as the order $K$ increases, yielding better rates than existing first- to third-order approaches.
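The abstract describes the discretization only at a high level. As a rough illustration of the stated ingredients (a chained $K$th-order dynamics, Lagrange interpolation of the drift, and Picard-iteration refinement of the node values), the following is a minimal Python sketch; the function name `kth_order_langevin_step`, the damping parameter `gamma`, the uniform node placement, and the simplified noise injection are all assumptions made for illustration, not the paper's actual algorithm.

```python
import numpy as np
from numpy.polynomial import polynomial as P

def kth_order_langevin_step(x, grad_f, h, gamma=2.0, nodes=3, picard_iters=3, rng=None):
    """One step of a collocation-style discretization of the K-th order
    Langevin chain (hypothetical sketch, not the paper's exact scheme):
        dx_1 = x_2 dt, ..., dx_{K-1} = x_K dt,
        dx_K = (-grad_f(x_1) - gamma * x_K) dt + sqrt(2 * gamma) dB_t.
    `x` has shape (K, d): x[0] is the position, x[1:] are auxiliary variables.
    """
    if rng is None:
        rng = np.random.default_rng()
    K, d = x.shape
    ts = np.linspace(0.0, h, nodes)        # Lagrange interpolation nodes in [0, h]
    X = np.repeat(x[None], nodes, axis=0)  # node values, constant initial guess

    def drift(state):
        out = np.empty_like(state)
        out[:-1] = state[1:]                             # chain structure
        out[-1] = -grad_f(state[0]) - gamma * state[-1]  # damped top coordinate
        return out

    # Picard iterations: interpolate the drift through the nodes with a
    # degree-(nodes - 1) polynomial, integrate it exactly, refine node values.
    for _ in range(picard_iters):
        F = np.stack([drift(X[j]) for j in range(nodes)])   # (nodes, K, d)
        c = P.polyfit(ts, F.reshape(nodes, -1), nodes - 1)  # interpolant coefficients
        C = P.polyint(c)                                    # antiderivative, zero at t = 0
        for j in range(1, nodes):
            X[j] = x + P.polyval(ts[j], C).reshape(K, d)    # x + integral of interpolant on [0, t_j]

    x_new = X[-1].copy()
    # Crude Euler-style noise on the top coordinate only; an analyzed scheme
    # would instead use the exact Gaussian law of the linear (OU) part.
    x_new[-1] += np.sqrt(2.0 * gamma * h) * rng.standard_normal(d)
    return x_new

# Toy usage: K = 3 chain targeting a standard Gaussian, f(x) = ||x||^2 / 2,
# so grad_f is the identity map.
K, d = 3, 2
x = np.zeros((K, d))
rng = np.random.default_rng(0)
samples = []
for _ in range(5000):
    x = kth_order_langevin_step(x, lambda q: q, h=0.05, rng=rng)
    samples.append(x[0].copy())
```

The Picard loop above is the generic fixed-point view of collocation; how many nodes to use, where to place them, and how the Gaussian part is handled are exactly the design choices the paper's scheme pins down.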