Loss of Signal (LOS) represents a significant cost for operators of optical networks. By studying large sets of real-world Performance Monitoring (PM) data collected from six international optical networks, we find that it is possible to forecast LOS events with supervised machine learning (ML) with good precision 1-7 days before they occur, albeit at relatively low recall. Our study covers twelve facility types, including 100G lines and ETH10G clients. We show that the precision for a given network improves when training on multiple networks simultaneously relative to training on that network alone. Furthermore, we show that it is possible to forecast LOS from all facility types and all networks with a single model, whereas fine-tuning for a particular facility or network brings only modest improvements. Hence our ML models remain effective for optical networks previously unseen during training, which makes them usable in commercial applications.