Hamiltonian systems with multiple timescales arise in molecular dynamics, classical mechanics, and theoretical physics. Long-time numerical integration of such systems requires resolving the fast dynamics with very small time steps, which incurs a high computational cost, especially in ensemble simulations for uncertainty quantification, sensitivity analysis, or varying initial conditions. We present a deep learning framework that learns the flow maps of Hamiltonian systems to accelerate long-time and ensemble simulations. Neural networks are trained, according to a chosen numerical scheme, either entirely without data to approximate flows over large time intervals, or with data to learn flows in intervals far from the initial time. For the latter, we propose a Hamiltonian Monte Carlo-based data generator. The architecture consists of simple feedforward networks that incorporate truncated Taylor expansions of the flow map, with a neural-network remainder capturing unresolved effects. Applied to benchmark non-integrable and non-canonical systems, the method achieves substantial speedups while preserving accuracy, enabling scalable simulation of complex Hamiltonian dynamics.
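The flow-map ansatz described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the names (`taylor_flow`, `RemainderNet`, `flow_map`) and the choice of a harmonic-oscillator Hamiltonian are assumptions made here for concreteness. The idea is that the map over a step `h` is written as a truncated Taylor expansion of the exact flow plus a term of order `h^(k+1)` produced by a small feedforward network, so the network only has to learn the remainder the expansion does not resolve.

```python
import numpy as np

def taylor_flow(z, h, k=2):
    """Order-k truncated Taylor expansion of the flow for the harmonic
    oscillator H = (q^2 + p^2)/2, whose Hamiltonian vector field is
    linear: f(z) = A z with A = [[0, 1], [-1, 0]].  For a linear field
    the n-th flow derivative is A^n z, so the Taylor sum is built by
    repeated multiplication (a stand-in for the general expansion)."""
    A = np.array([[0.0, 1.0], [-1.0, 0.0]])
    out = z.copy()
    term = z.copy()
    for n in range(1, k + 1):
        term = h * (A @ term) / n   # next Taylor term h^n A^n z / n!
        out = out + term
    return out

class RemainderNet:
    """Tiny (untrained, randomly initialized) MLP standing in for the
    learned remainder term; in the paper's setting this would be trained
    according to the chosen numerical scheme."""
    def __init__(self, dim=2, width=16, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0.0, 0.1, (width, dim))
        self.b1 = np.zeros(width)
        self.W2 = rng.normal(0.0, 0.1, (dim, width))
        self.b2 = np.zeros(dim)

    def __call__(self, z):
        return self.W2 @ np.tanh(self.W1 @ z + self.b1) + self.b2

def flow_map(z, h, k, net):
    """Psi_h(z) = truncated Taylor part + h^(k+1) * learned remainder."""
    return taylor_flow(z, h, k) + h ** (k + 1) * net(z)
```

Because the remainder enters with the prefactor `h^(k+1)`, the Taylor part alone already matches the exact flow to order `h^k`, and the network correction cannot degrade that baseline accuracy for small `h`; training then pushes the combined map beyond it.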