Fast and stable fluid simulations are an essential prerequisite for applications ranging from computer-generated imagery to computer-aided design in research and development. However, solving the partial differential equations of incompressible fluids is a challenging task, and traditional numerical approximation schemes come at a high computational cost. Recent deep-learning-based approaches promise vast speed-ups but do not generalize to new fluid domains, require fluid simulation data for training, or rely on complex pipelines that outsource major parts of the fluid simulation to traditional methods. In this work, we propose a novel physics-constrained training approach that generalizes to new fluid domains, requires no fluid simulation data, and allows convolutional neural networks to map a fluid state from time point t to a subsequent state at time t + dt in a single forward pass. This simplifies the pipeline to train and evaluate neural fluid models. After training, the framework yields models that are capable of fast fluid simulations and can handle various fluid phenomena, including the Magnus effect and Kármán vortex streets. We present an interactive real-time demo to show the speed and generalization capabilities of our trained models. Moreover, the trained neural networks are efficient differentiable fluid solvers, as they offer a differentiable update step to advance the fluid simulation in time. We exploit this fact in a proof-of-concept optimal control experiment. Our models significantly outperform a recent differentiable fluid solver in terms of computational speed and accuracy.
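To make the core idea concrete, the following is a minimal sketch (in PyTorch) of a CNN that advances a fluid state from time t to t + dt in a single forward pass, and of how such an update step remains differentiable across a rollout, which is what enables gradient-based optimal control. The architecture, channel layout, grid size, and all names below are hypothetical illustrations; the paper's actual model and its physics-constrained loss are not reproduced here.

```python
# Sketch only: a toy CNN time-stepper for a 2D fluid state, not the paper's model.
import torch
import torch.nn as nn

class FluidStepNet(nn.Module):
    """Maps (u, v, p) at time t plus a boundary mask to (u, v, p) at time t + dt."""
    def __init__(self, hidden=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(4, hidden, 3, padding=1), nn.ReLU(),
            nn.Conv2d(hidden, hidden, 3, padding=1), nn.ReLU(),
            nn.Conv2d(hidden, 3, 3, padding=1),       # next (u, v, p)
        )

    def forward(self, state, mask):
        # state: [B, 3, H, W] velocity and pressure fields; mask: [B, 1, H, W] obstacles/walls
        x = torch.cat([state, mask], dim=1)
        return state + self.net(x)                    # residual update over one time step

# Each step is an ordinary forward pass, so a multi-step rollout stays
# differentiable end to end and gradients can be taken w.r.t. the initial state
# or control inputs.
model = FluidStepNet()
state = torch.zeros(1, 3, 64, 64, requires_grad=True)  # initial fluid state (illustrative)
mask = torch.zeros(1, 1, 64, 64)                        # empty domain for illustration
for _ in range(10):                                     # advance 10 time steps
    state = model(state, mask)
loss = state[:, :2].pow(2).mean()                       # e.g. a control objective on velocity
loss.backward()                                         # gradients flow through all steps
```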