Remaining useful life (RUL) prediction is one of the key technologies of condition-based maintenance and is important for maintaining the reliability and safety of industrial equipment. Massive industrial measurement data have effectively improved the performance of data-driven RUL prediction methods. While deep learning has achieved great success in RUL prediction, existing methods have difficulty processing long sequences and extracting information from both the sensor and time-step aspects. In this paper, we propose Dual-Aspect Self-attention based on Transformer (DAST), a novel deep RUL prediction method with an encoder-decoder structure based purely on self-attention, without any RNN/CNN modules. DAST consists of two encoders that work in parallel to simultaneously extract features of different sensors and time steps. Relying solely on self-attention, the DAST encoders are more effective at processing long data sequences and are capable of adaptively learning to focus on the more important parts of the input. Moreover, the parallel feature-extraction design avoids mutual interference between the information from the two aspects. Experiments on two widely used turbofan engine datasets show that our method significantly outperforms state-of-the-art RUL prediction methods.
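The dual-aspect idea described above can be sketched in a few lines of PyTorch: two self-attention encoders run in parallel, one treating time steps as tokens and one treating sensors as tokens, and their pooled features are fused for RUL regression. This is a minimal illustrative sketch; the layer sizes, mean-pooling fusion, and linear regression head are assumptions, not the paper's exact configuration.

```python
import torch
import torch.nn as nn


class DASTSketch(nn.Module):
    """Minimal sketch of dual-aspect self-attention for RUL prediction.

    Hyperparameters (d_model, nhead, num_layers) and the fusion/regression
    head are illustrative assumptions, not the published DAST settings.
    """

    def __init__(self, num_sensors=14, seq_len=40, d_model=64,
                 nhead=4, num_layers=2):
        super().__init__()
        # Project each aspect into a common model dimension.
        self.time_proj = nn.Linear(num_sensors, d_model)  # tokens = time steps
        self.sensor_proj = nn.Linear(seq_len, d_model)    # tokens = sensors

        def make_encoder():
            layer = nn.TransformerEncoderLayer(
                d_model=d_model, nhead=nhead, batch_first=True)
            return nn.TransformerEncoder(layer, num_layers)

        # Two encoders working in parallel, one per aspect.
        self.time_encoder = make_encoder()
        self.sensor_encoder = make_encoder()
        self.head = nn.Linear(2 * d_model, 1)  # fused features -> scalar RUL

    def forward(self, x):
        # x: (batch, seq_len, num_sensors)
        t = self.time_encoder(self.time_proj(x))                      # (B, seq_len, d)
        s = self.sensor_encoder(self.sensor_proj(x.transpose(1, 2)))  # (B, sensors, d)
        # Mean-pool each aspect, concatenate, and regress the RUL.
        fused = torch.cat([t.mean(dim=1), s.mean(dim=1)], dim=-1)
        return self.head(fused).squeeze(-1)                           # (B,)


# Usage: a batch of 8 windows, 40 time steps, 14 sensor channels.
model = DASTSketch()
rul = model(torch.randn(8, 40, 14))  # shape (8,)
```

Keeping the two encoders separate (rather than stacking one after the other) is what realizes the paper's point that the parallel design keeps sensor-aspect and time-step-aspect features from interfering with each other before fusion.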