Neural Ordinary Differential Equations model dynamical systems with ODEs learned by neural networks. However, ODEs are fundamentally inadequate for modelling systems with long-range dependencies or discontinuities, which are common in engineering and biological systems. Broader classes of differential equations (DEs) have been proposed as remedies, including delay differential equations and integro-differential equations. Furthermore, Neural ODEs suffer from numerical instability when modelling stiff ODEs and ODEs with piecewise forcing functions. In this work, we propose Neural Laplace, a unified framework for learning diverse classes of DEs, including all the aforementioned ones. Instead of modelling the dynamics in the time domain, we model them in the Laplace domain, where history dependencies and discontinuities in time can be represented as summations of complex exponentials. To make learning more efficient, we use the geometrical stereographic map of a Riemann sphere to induce more smoothness in the Laplace domain. In the experiments, Neural Laplace shows superior performance in modelling and extrapolating the trajectories of diverse classes of DEs, including ones with complex history dependencies and abrupt changes.
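As a rough illustration of the two ideas above (a minimal numpy sketch, not the paper's implementation): the Bromwich inversion integral can be discretized into a finite sum of complex exponentials, here via the classical Fourier-series (Dubner-Abate) approximation, and points of the Laplace s-plane can be mapped onto the Riemann sphere by the standard inverse stereographic projection. The function names and the choices of contour parameter `sigma` and truncation level `n_terms` are illustrative assumptions.

```python
import numpy as np

def fourier_series_ilt(F, t, sigma=0.05, n_terms=2000):
    """Approximate the inverse Laplace transform of F at times t by
    discretizing the Bromwich integral along Re(s) = sigma (classical
    Fourier-series approximation): f(t) is expressed as a finite sum of
    complex exponentials exp((sigma + i*k*pi/T) * t) weighted by F."""
    t = np.asarray(t, dtype=float)
    T = 2.0 * t.max()                      # half-period controlling aliasing
    k = np.arange(1, n_terms + 1)
    s = sigma + 1j * np.pi * k / T         # query points on the contour
    terms = np.real(F(s)[None, :] * np.exp(1j * np.pi * np.outer(t, k) / T))
    series = 0.5 * np.real(F(np.array([sigma + 0j]))[0]) + terms.sum(axis=1)
    return np.exp(sigma * t) / T * series

def to_riemann_sphere(s):
    """Map complex numbers s onto the unit (Riemann) sphere by inverse
    stereographic projection: a smooth, bounded re-parameterization of
    the s-plane (one standard form of the map mentioned above)."""
    s = np.asarray(s)
    x, y, m2 = np.real(s), np.imag(s), np.abs(s) ** 2
    return np.stack([2 * x, 2 * y, m2 - 1], axis=-1) / (m2 + 1)[..., None]

# Sanity check: F(s) = 1/(s + 1) is the Laplace transform of f(t) = exp(-t).
t = np.linspace(0.1, 5.0, 50)
f_hat = fourier_series_ilt(lambda s: 1.0 / (s + 1.0), t)
print(np.max(np.abs(f_hat - np.exp(-t))))  # error shrinks as n_terms grows
```

In a learned model, the hand-coded transform `F` above would be replaced by a neural network that outputs Laplace-domain values at the query points `s`, so reconstructed trajectories inherit the ability of complex exponentials to express delays and abrupt changes.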