Remaining useful life prediction for multi-sensor systems using a novel end-to-end deep-learning method
Measurement (IF 5.6), Pub Date: 2021-06-15, DOI: 10.1016/j.measurement.2021.109685
Yuyu Zhao , Yuxiao Wang

Remaining useful life (RUL) prediction plays a crucial role in ensuring the reliability and safety of modern engineering systems. For complicated systems, the indirect nature of conventional RUL prediction approaches restricts their generality and accuracy. The challenge in realizing accurate RUL estimation lies in directly exploring the potential relationship between the RUL and the large volume of data from multiple monitoring sensors. Motivated by this fact, a novel end-to-end RUL prediction method based on a deep learning model is proposed in this paper. A long short-term memory (LSTM) encoder-decoder is employed as the main framework of the model to handle multivariate time series data. A two-stage attention mechanism is then developed to adaptively extract and evaluate the input features and their temporal correlations. On this basis, the RUL prediction is obtained by a multilayer perceptron. The proposed model can selectively focus on critical information without any prior knowledge, which is of great significance for enhancing RUL prediction accuracy. The effectiveness and superiority of the proposed method are experimentally validated on a turbofan engine dataset and compared with state-of-the-art methods.
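As an illustration of the architecture the abstract describes, the following is a minimal PyTorch sketch of an LSTM encoder-decoder with a two-stage attention mechanism (input-feature attention followed by temporal attention) and an MLP head producing a scalar RUL estimate. The layer sizes, the exact attention formulation, and the 14-sensor, 30-step input window are illustrative assumptions, not the authors' published implementation.

```python
# Hypothetical sketch of an end-to-end RUL model: LSTM encoder-decoder,
# two-stage attention, and an MLP regression head. Not the authors' code.
import torch
import torch.nn as nn


class TwoStageAttentionRUL(nn.Module):
    def __init__(self, n_sensors: int, hidden: int = 64):
        super().__init__()
        # Stage 1: input (feature) attention -- weights each sensor channel
        # at every time step so informative sensors are emphasized.
        self.feature_attn = nn.Linear(n_sensors, n_sensors)
        self.encoder = nn.LSTM(n_sensors, hidden, batch_first=True)
        # Stage 2: temporal attention -- weights encoder hidden states so the
        # decoder focuses on the most degradation-relevant time steps.
        self.temporal_attn = nn.Linear(hidden, 1)
        self.decoder = nn.LSTM(hidden, hidden, batch_first=True)
        # MLP mapping the decoder summary to a scalar RUL estimate.
        self.mlp = nn.Sequential(
            nn.Linear(hidden, hidden), nn.ReLU(), nn.Linear(hidden, 1)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, n_sensors) multivariate sensor readings
        alpha = torch.softmax(self.feature_attn(x), dim=-1)       # feature weights
        enc_out, _ = self.encoder(alpha * x)                      # (B, T, H)
        beta = torch.softmax(self.temporal_attn(enc_out), dim=1)  # (B, T, 1)
        dec_out, _ = self.decoder(beta * enc_out)                 # weighted states
        return self.mlp(dec_out[:, -1, :]).squeeze(-1)            # predicted RUL


# Example usage: a batch of 8 windows, 30 time steps, 14 sensor channels.
model = TwoStageAttentionRUL(n_sensors=14)
rul = model(torch.randn(8, 30, 14))
print(rul.shape)  # torch.Size([8])
```

The two softmax-weighted products stand in for the "adaptive extraction and evaluation" of input features and temporal correlations; a faithful reproduction would follow the attention equations given in the paper itself.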



Updated: 2021-06-23