Purely data-driven deep neural networks (DNNs) applied to physical engineering systems can infer relations that violate physics laws, leading to unexpected consequences. To address this challenge, we propose a physics-model-based DNN framework, called Phy-Taylor, that accelerates the learning of representations compliant with physical knowledge. The Phy-Taylor framework makes two key contributions: it introduces a new architectural unit, the Physics-compatible Neural Network (PhN), and it features a novel compliance mechanism we call {\em Physics-guided Neural Network Editing}. The PhN aims to directly capture nonlinearities inspired by physical quantities, such as kinetic energy, potential energy, electrical power, and aerodynamic drag force. To do so, the PhN augments neural network layers with two key components: (i) monomials of the Taylor series expansion of nonlinear functions capturing physical knowledge, and (ii) a suppressor for mitigating the influence of noise. The neural-network editing mechanism further modifies network links and activation functions to keep them consistent with physical knowledge. As an extension, we also propose a self-correcting Phy-Taylor framework that introduces two additional capabilities: (i) physics-model-based safety relationship learning, and (ii) automatic output correction when safety violations occur. Through experiments, we show that, by expressing hard-to-learn nonlinearities directly and by constraining dependencies, Phy-Taylor features considerably fewer parameters and a remarkably accelerated training process, while offering enhanced model robustness and accuracy.
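To make the PhN idea concrete, the following is a minimal sketch of one physics-compatible layer: the input is first passed through a magnitude-threshold noise suppressor, then expanded into Taylor-series monomials (the features that can directly express physical quantities such as kinetic energy, which is quadratic in velocity), and finally mapped through a learnable linear transform. The function names, the specific suppressor (hard thresholding), and the monomial ordering are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np
from itertools import combinations_with_replacement

def taylor_monomials(x, order=2):
    """Augment input vector x with all monomials of its entries up to
    the given order (constant, linear, quadratic, ... Taylor features)."""
    feats = [np.ones(1)]  # constant term of the expansion
    for k in range(1, order + 1):
        for idx in combinations_with_replacement(range(len(x)), k):
            # product of the selected entries, e.g. x0*x1 or x0**2
            feats.append(np.prod(x[list(idx)], keepdims=True))
    return np.concatenate(feats)

def suppress(x, threshold=1e-2):
    """Noise suppressor (illustrative): zero out entries whose magnitude
    falls below the threshold, so measurement noise is not amplified by
    the higher-order monomials."""
    return np.where(np.abs(x) < threshold, 0.0, x)

class PhNLayer:
    """One PhN-style layer: suppress noise, expand into Taylor monomials,
    then apply a learnable linear map over the augmented features."""
    def __init__(self, in_dim, out_dim, order=2, rng=None):
        rng = rng or np.random.default_rng(0)
        n_feats = len(taylor_monomials(np.zeros(in_dim), order))
        self.W = rng.standard_normal((out_dim, n_feats)) * 0.1
        self.order = order

    def __call__(self, x):
        z = taylor_monomials(suppress(x), self.order)
        return self.W @ z
```

Because quantities like kinetic energy or drag force are low-order polynomials in the layer inputs, a single linear map over these monomials can represent them exactly, which is the intuition behind the reduced parameter count and faster training reported in the abstract.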