Remaining useful life (RUL) prediction is one of the key technologies of condition-based maintenance, and it is essential for maintaining the reliability and safety of industrial equipment. While deep learning has achieved great success in RUL prediction, existing methods struggle to process long sequences and to extract information from both the sensor and time-step aspects. In this paper, we propose Dual Aspect Self-attention based on Transformer (DAST), a novel deep RUL prediction method. DAST consists of two encoders that work in parallel to simultaneously extract features of different sensors and time steps. Built solely on self-attention, the DAST encoders are more effective at processing long data sequences and can adaptively learn to focus on the more important parts of the input. Moreover, the parallel feature-extraction design avoids mutual interference between the information from the two aspects. Experimental results on two real turbofan engine datasets show that our method significantly outperforms state-of-the-art methods.
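To make the dual-aspect idea concrete, the following is a minimal sketch, not the authors' implementation: one Transformer encoder attends over time-step tokens and a second, parallel encoder attends over sensor tokens, and the pooled features from both aspects are fused for RUL regression. PyTorch, the class name DualAspectEncoder, and all dimensions (e.g., 14 sensors, a 40-step window, d_model=64) are assumptions for illustration only.

```python
import torch
import torch.nn as nn

class DualAspectEncoder(nn.Module):
    """Sketch of a dual-aspect (time-step / sensor) self-attention model.

    Input x has shape (batch, time_steps, num_sensors). Two Transformer
    encoders run in parallel: one treats each time step as a token, the
    other treats each sensor's time series as a token. Pooled features
    from both aspects are concatenated and regressed to a scalar RUL.
    """

    def __init__(self, num_sensors=14, time_steps=40, d_model=64,
                 nhead=4, num_layers=2):
        super().__init__()
        # Time-step aspect: project each time step's sensor vector to d_model.
        self.time_proj = nn.Linear(num_sensors, d_model)
        time_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.time_encoder = nn.TransformerEncoder(time_layer, num_layers)
        # Sensor aspect: project each sensor's full time series to d_model.
        self.sensor_proj = nn.Linear(time_steps, d_model)
        sensor_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.sensor_encoder = nn.TransformerEncoder(sensor_layer, num_layers)
        # Fusion head: concatenated aspect features -> scalar RUL estimate.
        self.head = nn.Sequential(nn.Linear(2 * d_model, d_model),
                                  nn.ReLU(),
                                  nn.Linear(d_model, 1))

    def forward(self, x):
        # x: (batch, time_steps, num_sensors)
        t_feat = self.time_encoder(self.time_proj(x)).mean(dim=1)
        s_feat = self.sensor_encoder(self.sensor_proj(x.transpose(1, 2))).mean(dim=1)
        return self.head(torch.cat([t_feat, s_feat], dim=-1)).squeeze(-1)

x = torch.randn(8, 40, 14)            # a batch of 8 sensor windows
print(DualAspectEncoder()(x).shape)   # torch.Size([8])
```

Because the two encoders operate on separate token sets, the sensor-aspect and time-step-aspect features are computed independently before fusion, which is one simple way to realize the parallel, non-interfering feature extraction described above.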