Reliable uncertainty quantification for RUL prediction is crucial for informed decision-making in predictive maintenance. In this context, we assess some of the latest developments in uncertainty quantification for deep learning prognostics. These include state-of-the-art variational inference algorithms for Bayesian neural networks (BNN) as well as popular alternatives such as Monte Carlo Dropout (MCD), deep ensembles (DE), and heteroscedastic neural networks (HNN). All the inference techniques share the same Inception deep learning architecture as the functional model. We performed a hyperparameter search to optimize the main variational and learning parameters of the algorithms. The performance of the methods is evaluated on a subset of the large NASA N-CMAPSS dataset for aircraft engines. The assessment covers RUL prediction accuracy, the quality of the predictive uncertainty, and the ability to decompose the total predictive uncertainty into its aleatoric and epistemic parts. The results show that no method clearly outperforms the others in all situations. Although all methods are close in terms of accuracy, we find differences in how they estimate uncertainty: DE and MCD generally provide more conservative predictive uncertainty than BNN. Surprisingly, HNN achieves strong results without the added training complexity and extra parameters of BNN. For tasks such as active learning, where a separation of epistemic and aleatoric uncertainty is required, radial BNN and MCD appear to be the best options.
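The decomposition mentioned above follows the law of total variance: epistemic uncertainty is the spread between stochastic forward passes (model disagreement), while aleatoric uncertainty is the average data noise predicted by the model. The following minimal sketch illustrates this for an MCD- or BNN-style predictor; the sample values are synthetic placeholders, not results from the paper.

```python
import numpy as np

# Hypothetical example: S stochastic forward passes (e.g., MC Dropout),
# each returning a predicted RUL mean and an aleatoric variance.
rng = np.random.default_rng(0)
S = 50
mu = rng.normal(40.0, 2.0, size=S)      # predicted RUL means (cycles), synthetic
sigma2 = rng.uniform(4.0, 6.0, size=S)  # predicted aleatoric variances, synthetic

epistemic = mu.var()           # disagreement between stochastic passes
aleatoric = sigma2.mean()      # average predicted observation noise
total = epistemic + aleatoric  # law of total variance

print(f"epistemic={epistemic:.2f}, aleatoric={aleatoric:.2f}, total={total:.2f}")
```

A deterministic HNN produces only the aleatoric term (a single mean/variance pair), which is why it cannot separate the two sources of uncertainty on its own.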