Machine learning is gaining momentum in recent models for the dynamic analysis of information flows in data communication networks. These early models often apply off-the-shelf learners to predict from historical statistics while disregarding the physics governing how these flows are generated. This paper instead introduces the Flow Neural Network (FlowNN), which improves the feature representation with a learned physical bias. This is implemented by an induction layer, working on top of the embedding layer, that imposes the physics-connected data correlations, together with a self-supervised learning strategy with stop-gradient that makes the learned physics universal. On short-timescale network prediction tasks, FlowNN reduces loss by 17% to 71% relative to state-of-the-art baselines on both synthetic and real-world networking datasets, demonstrating the strength of this new approach. Code will be made available.
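The abstract does not specify the form of the stop-gradient self-supervised objective. As a rough illustration only (an assumption, not necessarily FlowNN's exact loss), a SimSiam-style symmetrized objective can be sketched in plain NumPy; "stop-gradient" means the target branch `z` is treated as a constant, so gradients would flow only through the predictor outputs `p`:

```python
import numpy as np

def cosine(a, b):
    """Cosine similarity between two 1-D vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def siamese_stopgrad_loss(p1, z2, p2, z1):
    """Symmetrized negative cosine similarity.

    z1 and z2 play the role of stop-gradient targets: in an autograd
    framework they would be detached, so only the predictor outputs
    p1 and p2 receive gradients during training.
    """
    return -0.5 * (cosine(p1, z2) + cosine(p2, z1))

# Toy example: embeddings of two "views" of the same flow window
# (hypothetical numbers, purely for illustration).
p1 = np.array([1.0, 0.0]); z1 = np.array([1.0, 0.0])
p2 = np.array([0.8, 0.6]); z2 = np.array([0.8, 0.6])
loss = siamese_stopgrad_loss(p1, z2, p2, z1)  # -0.8 for these vectors
```

The loss reaches its minimum of -1 when both predictor outputs align perfectly with the opposite branch's detached targets.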