Generalizing machine learning (ML) models for network traffic dynamics tends to be considered a lost cause. Hence, for every new task, we design new models and train them on model-specific datasets closely mimicking the deployment environments. Yet, an ML architecture called _Transformer_ has enabled previously unimaginable generalization in other domains. Nowadays, one can download a model pre-trained on massive datasets and fine-tune it for a specific task and context with comparatively little time and data. These fine-tuned models are now state-of-the-art for many benchmarks. We believe this progress could translate to networking and propose a Network Traffic Transformer (NTT), a transformer adapted to learn network dynamics from packet traces. Our initial results are promising: NTT seems able to generalize to new prediction tasks and environments. This study suggests there is still hope for generalization, though it calls for a lot of future research.