This paper introduces a new neural-network-based approach, the IN-context Differential Equation Encoder-Decoder (INDEED), which simultaneously learns operators from data and applies them to new problems during the inference stage, without any weight update. Existing methods are limited to using a neural network to approximate a specific equation solution or a specific operator, and thus require retraining when switching to a new problem with different equations. By training a single neural network as an operator learner, we not only eliminate retraining (or even fine-tuning) for new problems, but also leverage the commonalities shared across operators, so that only a few demos are needed when learning a new operator. Our numerical results demonstrate the neural network's capability as a few-shot operator learner for diverse types of differential equation problems, including forward and inverse problems of ODEs and PDEs, and show that it can generalize its learning capability to operators beyond the training distribution, even to an unseen type of operator.
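The inference pattern described above can be sketched in a minimal form. A trained encoder-decoder would map the demos and the query to a prediction in a single forward pass; as a stand-in for that trained network, the toy function below infers a linear operator from a few (condition, quantity-of-interest) demos at inference time, with no training loop and no stored weights being updated. The function and variable names are illustrative, not from the paper.

```python
import numpy as np

def in_context_predict(demo_conditions, demo_qois, query_condition):
    """Infer an operator from a few (condition, QoI) demo pairs and
    apply it to a new query condition -- no weight updates involved.

    demo_conditions, demo_qois: arrays of shape (n_demos, n_points).
    query_condition: array of shape (n_points,).
    """
    # Stand-in "in-context learning": fit a linear operator A with
    # demo_conditions @ A ~= demo_qois by least squares.
    A, *_ = np.linalg.lstsq(demo_conditions, demo_qois, rcond=None)
    return query_condition @ A

# Toy example: the hidden operator simply scales the input, u -> 2u.
rng = np.random.default_rng(0)
demos_u = rng.standard_normal((5, 4))   # few-shot demo conditions
demos_v = 2.0 * demos_u                 # QoIs from the hidden operator
query_u = rng.standard_normal(4)        # a new question

pred = in_context_predict(demos_u, demos_v, query_u)
print(np.allclose(pred, 2.0 * query_u))  # the operator was recovered
```

This captures only the interface idea (demos in, prediction out, no retraining); the paper's actual model is a neural network trained across many operators, which is what enables few-shot generalization to unseen operator types.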