We propose a framework for solving evolution equations within parametric function classes, especially ones that are specified by neural networks. We call this framework the minimal neural evolution (MNE) because it is motivated by the goal of seeking the smallest instantaneous change in the neural network parameters that is compatible with exact solution of the evolution equation at a set of evolving collocation points. Formally, the MNE is quite similar to the recently introduced Neural Galerkin framework, but a difference in perspective motivates an alternative sketching procedure that effectively reduces the linear systems solved within the integrator to a size that is interpretable as an effective rank of the evolving neural tangent kernel, while maintaining a smooth evolution equation for the neural network parameters. We focus specifically on the application of this framework to diffusion processes, where the score function allows us to define intuitive dynamics for the collocation points. These can in turn be propagated jointly with the neural network parameters using a high-order adaptive integrator. In particular, we demonstrate how the Ornstein-Uhlenbeck diffusion process can be used for the task of sampling from a probability distribution given a formula for the density but no training data. This framework extends naturally to allow for conditional sampling and marginalization, and we show how to systematically remove the sampling bias due to parametric approximation error. We validate the efficiency, systematic improvability, and scalability of our approach on illustrative examples in low and high spatial dimensions.
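To make the minimal-change principle concrete, one natural reading of the formulation sketched above (the notation here is ours and is only an assumption about the setup; the paper's precise definitions may differ) is the following: given a parametric ansatz $u(\cdot\,;\theta)$, an evolution equation $\partial_t u = \mathcal{F}[u]$, and collocation points $x_1,\dots,x_m$, the parameter velocity is chosen as the smallest update enforcing the equation at those points,
\[
\dot\theta \;=\; \operatorname*{argmin}_{v}\;\|v\|_2^2
\quad\text{subject to}\quad
\nabla_\theta u(x_i;\theta)\cdot v \;=\; \mathcal{F}[u(\cdot\,;\theta)](x_i),
\qquad i=1,\dots,m,
\]
whose minimum-norm solution is $\dot\theta = J^{+} f$ with $J_{ij} = \partial_{\theta_j} u(x_i;\theta)$ and $f_i = \mathcal{F}[u(\cdot\,;\theta)](x_i)$. The Gram matrix $JJ^{\top}$ is an empirical neural tangent kernel, which is consistent with the abstract's description of a sketching procedure that reduces the linear systems to a size interpretable as an effective rank of that kernel.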