
Learning Task-Relevant Representations with Query-Refined Attention for Remaining Useful Life Prediction


Abstract

Accurate prediction of remaining useful life (RUL) is a crucial task in prognostics and health management (PHM), enabling predictive maintenance and failure prevention in industrial systems. Recent deep learning approaches have substantially improved RUL estimation by leveraging time-series sensor data, and attention mechanisms in particular have been actively explored to enhance feature representation and prediction accuracy. However, existing attention-based models often fail to explicitly capture degradation patterns that are directly relevant to the prediction target, and they are limited in modeling spatiotemporal dependencies across multivariate sensor data.

To address these limitations, this study proposes a query-refined attention framework for learning task-specific representations in RUL prediction. The framework comprises two methods, each designed to refine the query. First, Supervised Query Refinement in Parallel Attention for RUL Prediction employs a parallel attention mechanism in which the query vectors are explicitly supervised with ground-truth RUL values. Aligning the queries with the prediction target makes the attention more relevant to the task and lets the model analyze correlations across both temporal and sensor dimensions, markedly improving its ability to track degradation trends.

Second, Structured Query Refinement in Recurrent Attention for RUL Prediction introduces a recurrent attention mechanism that progressively refines the query representation through a three-stage structure of self-attention, contextual attention, and label-encoded attention. Unlike conventional mechanisms that apply uniform transformations to queries, keys, and values, this approach explicitly structures the refinement process to dynamically emphasize task-relevant features.

Together, the two methods outperform existing state-of-the-art models on benchmark datasets. The supervised query refinement method establishes a foundation by enhancing feature relevance through direct label alignment in a parallel attention structure, which the recurrent refinement approach then extends to support dynamic query refinement and improved context sensitivity. Attention visualization further confirms the interpretability of the framework by revealing its focus on critical degradation-related features. By systematically integrating spatiotemporal dependencies with task-specific query learning, this research advances deep learning-based RUL prediction and lays a foundation for interpretable, accurate, and generalizable PHM systems applicable to real-world industrial settings.
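To make the query-supervision idea concrete, the sketch below shows one way the first method could be realized in PyTorch: the query projection receives an auxiliary regression loss against the ground-truth RUL, steering the attention weights toward target-relevant features. This is a minimal illustration written from the abstract alone; the class name, the mean-pooling choice, and the auxiliary head are assumptions, not the dissertation's actual architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SupervisedQueryAttention(nn.Module):
    """Hypothetical sketch: scaled dot-product attention whose query is
    directly supervised with the ground-truth RUL via an auxiliary head."""

    def __init__(self, n_sensors: int, d_model: int):
        super().__init__()
        self.q_proj = nn.Linear(n_sensors, d_model)
        self.k_proj = nn.Linear(n_sensors, d_model)
        self.v_proj = nn.Linear(n_sensors, d_model)
        # Auxiliary head that regresses the pooled query onto the RUL label.
        self.q_head = nn.Linear(d_model, 1)

    def forward(self, x, rul=None):
        # x: (batch, time, n_sensors) -- one sliding window of sensor data
        q, k, v = self.q_proj(x), self.k_proj(x), self.v_proj(x)
        scores = q @ k.transpose(-2, -1) / (q.size(-1) ** 0.5)
        out = F.softmax(scores, dim=-1) @ v  # (batch, time, d_model)

        aux_loss = None
        if rul is not None:
            # Query supervision: align the time-pooled query with the target
            # so attention is computed from a task-relevant representation.
            q_pred = self.q_head(q.mean(dim=1)).squeeze(-1)  # (batch,)
            aux_loss = F.mse_loss(q_pred, rul)
        return out, aux_loss
```

During training, `aux_loss` would presumably be added to the main RUL regression loss with a weighting coefficient, e.g. `loss = main_loss + lam * aux_loss`.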
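The second method's three-stage refinement can be sketched in the same spirit: a query produced by self-attention is refined by a contextual attention stage and then attends over an encoded RUL label, while keys and values stay tied to the input. Everything here (the label encoder, the head count, the use of the label at this stage) is likewise an assumption for illustration; the abstract does not specify how the label-encoded stage behaves at inference, when ground-truth RUL is unavailable.

```python
import torch
import torch.nn as nn

class RecurrentQueryRefinement(nn.Module):
    """Hypothetical sketch of the three-stage refinement: only the query
    is carried forward and updated between the attention stages."""

    def __init__(self, d_model: int, n_heads: int = 4):
        super().__init__()
        self.self_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ctx_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.label_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.label_enc = nn.Linear(1, d_model)  # embeds a scalar RUL label

    def forward(self, x, rul):
        # x: (batch, time, d_model) encoded sensor window; rul: (batch,)
        q, _ = self.self_attn(x, x, x)       # stage 1: self-attention
        q, _ = self.ctx_attn(q, x, x)        # stage 2: contextual attention
        lab = self.label_enc(rul.view(-1, 1, 1))  # (batch, 1, d_model)
        q, _ = self.label_attn(q, lab, lab)  # stage 3: label-encoded attention
        return q  # refined, task-relevant query representation
```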


Table of Contents

I. Introduction
1.1 Motivation
1.2 Problem Statement
1.3 Background
1.3.1 Time-Series Data and Sensor Measurements in Prognostics and Health Monitoring
1.3.2 Fundamentals of Remaining Useful Life Prediction
1.3.3 Attention Mechanism for Sequential Data: Concepts and Principles
1.4 Overview and Contribution
1.5 Dissertation Outline
II. Supervised Query Refinement in Parallel Attention for RUL Prediction
2.1 Introduction
2.2 Related Works
2.2.1 Deep Learning Approaches
2.2.2 Attention-Based RUL Prediction Models
2.3 Methodology
2.3.1 Overall Architecture
2.3.2 Parallel Attention Network with Supervised Query Refinement
2.3.3 Bidirectional LSTM Network
2.3.4 Fusion Network
2.4 Experimental Results
2.4.1 Datasets and Data Processing
2.4.2 Evaluation Metrics
2.4.3 Implementation Details
2.4.4 Comparison with State-of-the-Art Methods
2.4.5 Ablation Study
2.4.5.1 Effectiveness of Each Path Network
2.4.5.2 Effectiveness of Supervised Query-Refined Attention Network
2.4.5.3 Robustness to Window Length Changes
2.4.5.4 Visualization of Attention Scores in Supervised Query-Refined Attention Network
III. Structured Query Refinement in Recurrent Attention for RUL Prediction
3.1 Introduction
3.2 Related Works
3.3 Methodology
3.3.1 Overall Architecture
3.3.2 Recurrent Attention Network with Structured Query Refinement
3.3.3 Self-Attention Module
3.3.4 Contextual Attention Module
3.3.5 Label-Encoded Attention Module
3.3.6 Structure of the Proposed Model
3.4 Experimental Results
3.4.1 Comparative Study of Evaluation Metrics
3.4.2 Comparative Study of Computational Complexity
3.4.3 Ablation Study
3.4.3.1 Effectiveness of Query Compared to Key and Value Refinement
3.4.3.2 Effectiveness of Supervising Only Query Compared to Query, Key, and Value
3.4.3.3 Effectiveness of Each Subnetwork
3.4.3.4 Effectiveness of Recurrent Attention Structure
3.4.3.5 Removing Query Supervision in Contextual Attention Module
3.4.3.6 Query Representation Map Obtained from the Recurrent Attention
3.4.3.7 Explainability of Attention Scores Obtained from the Recurrent Attention
IV. Conclusion and Future Work
4.1 Conclusion
4.2 Limitations
4.3 Future Work
Bibliography
