Computer Integrated Manufacturing System ›› 2025, Vol. 31 ›› Issue (1): 290-305.DOI: 10.13196/j.cims.2022.0476


Remaining useful life prediction of aero-engine based on knowledge distillation compression hybrid model

ZHANG Xiaoyan,LIU Yuefeng+   

  1. School of Information Engineering, Inner Mongolia University of Science and Technology
  • Online:2025-01-31 Published:2025-02-11
  • Supported by:
    Project supported by the Inner Mongolia Discipline Inspection and Supervision Big Data Laboratory Open Project Fund, China (No. IMDBD2020022).

  • About the authors:
    ZHANG Xiaoyan (1997-), female, from Hohhot, Inner Mongolia, China; master's student; research interests include deep learning and aero-engine remaining useful life prediction; E-mail: 1213185642@qq.com.

    +LIU Yuefeng (1977-), male, from Baotou, Inner Mongolia, China; associate professor, Ph.D.; research interests include deep learning, machine learning, knowledge graphs, image processing, and big data analysis and applications; corresponding author; E-mail: liuyuefeng01035@163.com.

Abstract: Aero-engine Remaining Useful Life (RUL) prediction is essential for improving the safety of aircraft operation systems and reducing maintenance costs. Given the limited computing power and memory of edge devices in practical applications, it is urgent to reduce model size while preserving RUL prediction accuracy. Therefore, a complex hybrid model compressed by two rounds of knowledge distillation was proposed to predict the RUL of aero-engines. First, knowledge distillation was used to transfer knowledge between networks with different architectures. The teacher model combined complex networks such as a Convolutional Neural Network (CNN) and a Bidirectional Long Short-Term Memory network (Bi-LSTM): the CNN extracted spatial features from the data, while the Bi-LSTM learned bidirectional long-term dependencies. The student model used a relatively simple Multi-Scale CNN (MS-CNN), in which convolutions at different scales extracted data information separately to preserve the completeness of feature learning. Second, knowledge distillation was used to transfer knowledge between networks with the same architecture, with both the teacher and student networks using the MS-CNN. Finally, the student network obtained after the two rounds of distillation was evaluated in RUL prediction experiments on the C-MAPSS dataset. The results showed that the distilled student model improved prediction accuracy on all four sub-datasets; on FD001, its error was 5.9% lower than that of the original MS-CNN and 6.7% lower than that of the Bi-LSTM. This demonstrated that the proposed method is competitive in the field of aero-engine RUL prediction.
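Each round of the distillation described above amounts to training a student network against a weighted combination of the ground-truth RUL labels (hard loss) and the teacher's predictions (soft loss). A minimal sketch of such a regression distillation loss follows; the MSE form and the weighting parameter `alpha` are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def distillation_loss(student_pred, teacher_pred, target, alpha=0.5):
    """Combined loss for regression knowledge distillation.

    hard loss: student error against the true RUL labels
    soft loss: student error against the teacher's predictions
    alpha balances the two terms (an assumed hyperparameter).
    """
    hard = np.mean((student_pred - target) ** 2)
    soft = np.mean((student_pred - teacher_pred) ** 2)
    return alpha * hard + (1.0 - alpha) * soft

# Example: student predicts 2.0, teacher predicts 1.0, true RUL is 0.0.
# hard = 4.0, soft = 1.0, so the combined loss at alpha=0.5 is 2.5.
loss = distillation_loss(np.array([2.0]), np.array([1.0]), np.array([0.0]))
```

In the first round the teacher would be the CNN/Bi-LSTM hybrid and the student the MS-CNN; in the second round both roles would be filled by MS-CNNs, with only the loss inputs changing.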

Key words: remaining useful life, knowledge distillation, convolutional neural networks, bidirectional long short-term memory network, multi-scale convolutional neural network

