Neural networks often require large amounts of data to generalize well and can be ill-suited to modeling small, noisy experimental datasets. Standard network architectures trained on scarce, noisy data will return predictions that violate the underlying physics. In this paper, we present methods for embedding even--odd symmetries and conservation laws in neural networks, and we propose novel extensions and use cases for physical-constraint-embedded neural networks. We design an even--odd decomposition architecture that disentangles a neural-network-parameterized function into its even and odd components, and we demonstrate that it can accurately infer symmetries without prior knowledge. We highlight the noise-resilient properties of physical-constraint-embedded neural networks and demonstrate their utility as physics-informed noise regulators. Here, we employ a network with an embedded conservation-of-energy constraint as a physics-informed noise regulator for a symbolic regression task, and we show that our approach returns a symbolic representation of the neural-network-parameterized function that aligns well with the underlying physics while outperforming a baseline symbolic regression approach.
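The even--odd decomposition mentioned above rests on the identity that any function splits uniquely into an even and an odd part: f_even(x) = (f(x) + f(-x))/2 and f_odd(x) = (f(x) - f(-x))/2. A minimal sketch of this idea in NumPy is given below; the function name `even_odd_split` and the framing are illustrative assumptions, not the paper's exact architecture, which evaluates a learned network rather than a closed-form function.

```python
import numpy as np

def even_odd_split(f, x):
    """Split f into its even and odd components at the points x.

    Uses the identity f(x) = f_even(x) + f_odd(x), where
    f_even(x) = (f(x) + f(-x)) / 2 and f_odd(x) = (f(x) - f(-x)) / 2.
    A decomposition architecture can enforce the same structure by
    evaluating a learned function at both x and -x (hypothetical
    framing; the paper's architecture may differ in detail).
    """
    fx, fnx = f(x), f(-x)
    return (fx + fnx) / 2.0, (fx - fnx) / 2.0

# Example: f(x) = x**3 + cos(x) has odd part x**3 and even part cos(x).
x = np.linspace(-2.0, 2.0, 9)
f = lambda t: t**3 + np.cos(t)
even, odd = even_odd_split(f, x)
```

Because the split is exact and unique, a network built around this structure can expose the symmetry of the target function directly from its learned components.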