We introduce multi-head neural networks (MH-NNs) to physics-informed machine learning: a type of neural network (NN) whose nonlinear hidden layers form a shared body and whose multiple linear output layers form the heads. On this basis, we construct multi-head physics-informed neural networks (MH-PINNs) as a potent tool for multi-task learning (MTL), generative modeling, and few-shot learning for diverse problems in scientific machine learning (SciML). MH-PINNs connect multiple functions/tasks via a shared body, which serves as a set of basis functions, and via a shared distribution over the heads. The former is accomplished by solving multiple tasks with MH-PINNs, with each head corresponding independently to one task; the latter by employing normalizing flows (NFs) for density estimation and generative modeling. Our method is therefore a two-stage method, and both stages can be tackled with standard deep learning tools for NNs, enabling easy implementation in practice. MH-PINNs can be used for various purposes, such as approximating stochastic processes, solving multiple tasks synergistically, providing informative prior knowledge for downstream few-shot learning tasks such as meta-learning and transfer learning, learning representative basis functions, and uncertainty quantification. We demonstrate the effectiveness of MH-PINNs on five benchmarks and also investigate the possibility of synergistic learning in regression analysis. We name the open-source code "Lernaean Hydra" (L-HYDRA), since this mythical creature possessed many heads for performing multiple important tasks, as in the proposed method.
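To make the architecture described above concrete, the following is a minimal sketch, not the authors' implementation, of a multi-head NN with a shared nonlinear body and independent linear heads. PyTorch is assumed, and the class name, layer widths, and number of heads are hypothetical; the physics-informed loss and the normalizing-flow stage for the heads are omitted.

```python
# Minimal sketch (assumed PyTorch) of a multi-head NN:
# a shared nonlinear body followed by K independent linear heads.
import torch
import torch.nn as nn

class MultiHeadNN(nn.Module):
    def __init__(self, in_dim=1, hidden_dim=50, out_dim=1, num_heads=10):
        super().__init__()
        # Shared body: all nonlinear hidden layers (learned basis functions).
        self.body = nn.Sequential(
            nn.Linear(in_dim, hidden_dim), nn.Tanh(),
            nn.Linear(hidden_dim, hidden_dim), nn.Tanh(),
        )
        # One linear output head per task; each head corresponds to one task.
        self.heads = nn.ModuleList(
            [nn.Linear(hidden_dim, out_dim) for _ in range(num_heads)]
        )

    def forward(self, x, head_idx):
        # Evaluate the shared basis, then project with the selected head.
        return self.heads[head_idx](self.body(x))

# Example usage: evaluate the 3rd task of a 10-task model on 1-D inputs.
model = MultiHeadNN()
x = torch.linspace(0.0, 1.0, 32).unsqueeze(-1)
u3 = model(x, head_idx=3)
```

In the two-stage approach summarized in the abstract, a model of this kind would first be trained on all tasks jointly (stage one), after which the trained head parameters serve as samples for density estimation with normalizing flows (stage two).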