Practical options for adopting recurrent neural network and its variants on remaining useful life prediction

Date

2021-07-12

Publisher

Springer

Type

Article

ISSN

1000-9345

Citation

Wang Y, Zhao Y, Addepalli S. (2021) Practical options for adopting recurrent neural network and its variants on remaining useful life prediction. Chinese Journal of Mechanical Engineering, Volume 34, July 2021, Article number 69

Abstract

The remaining useful life (RUL) of a system is generally predicted by utilising the data collected from sensors that continuously monitor different indicators. Recently, different deep learning (DL) techniques have been applied to RUL prediction with great success. Because such data are often time-sequential, the recurrent neural network (RNN) has attracted significant interest due to its efficiency in dealing with sequential data. This paper systematically reviews RNN and its variants for RUL prediction, with a specific focus on understanding how different components (e.g., types of optimisers and activation functions) and parameters (e.g., sequence length, neuron quantities) affect their performance. A case study using NASA's well-studied C-MAPSS dataset is then presented to quantitatively evaluate the influence of various state-of-the-art RNN structures on RUL prediction performance. The results suggest that the variant methods usually perform better than the original RNN, and among them Bi-directional Long Short-Term Memory generally gives the best performance in terms of stability, precision and accuracy. Certain model structures may fail to produce valid RUL predictions due to the vanishing or exploding gradient problem if the parameters are not chosen appropriately. It is concluded that parameter tuning is a crucial step in achieving optimal prediction performance.
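
As an illustration of the kind of model the paper evaluates, the sketch below shows a minimal Bi-directional LSTM regressor trained on fixed-length windows of multivariate sensor data. The window length, number of sensors, layer sizes and optimiser are assumed values chosen for the example only, not the configurations reported in the study.

```python
# Illustrative sketch only: a minimal Bi-directional LSTM regressor for RUL
# prediction on fixed-length windows of multivariate sensor data
# (C-MAPSS-style input of shape [window_length, n_sensors]).
# All hyperparameters below are hypothetical, not the paper's settings.
import numpy as np
import tensorflow as tf

window_length = 30   # assumed sequence length (a tunable parameter)
n_sensors = 14       # assumed number of monitored indicators

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(window_length, n_sensors)),
    # The Bi-directional wrapper reads each sensor window forwards and backwards
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(64)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1),  # predicted remaining useful life (e.g., in cycles)
])
model.compile(optimizer="adam", loss="mse", metrics=["mae"])

# Dummy arrays standing in for windowed, normalised sensor readings and RUL labels
X = np.random.rand(128, window_length, n_sensors).astype("float32")
y = (np.random.rand(128, 1) * 125.0).astype("float32")
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
```

Swapping the Bi-directional LSTM layer for a plain SimpleRNN, LSTM or GRU layer, or varying the window length and neuron counts, reproduces the kind of structural and parameter comparison discussed in the abstract.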

Keywords

Gated recurrent unit, Bi-directional long short-term memory, Long short-term memory, Recurrent neural network, Deep learning, Remaining useful life prediction

Rights

Attribution 4.0 International
