Mel Spectrogram-based advanced deep temporal clustering model with unsupervised data for fault diagnosis

Geonkyo Hong, Dongjun Suh

Research output: Contribution to journal › Article › peer-review

17 Scopus citations

Abstract

Fault diagnosis of mechanical equipment using data-driven machine learning methods has recently been developed as a promising technique for improving the reliability of industrial systems. However, these methods suffer from data sparsity due to the difficulty of data collection, which limits the feature extraction of anomalies. To solve this problem, we propose the mel spectrogram-based advanced deep temporal clustering (ADTC) model, which can extract and verify the features of unlabeled data through an unsupervised-learning-based autoencoder and K-means clustering. In addition, the ADTC model uses the proposed centroid-based learning to obtain calibrated unsupervised learning data by minimizing the distance between data points and target centroids for misclustered encoder output features in ensemble-based unsupervised learning. The classifier of the ADTC model uses a supervised deep support vector machine network model, which is robust to nonlinear data, to diagnose faults in the mechanical equipment. The proposed ADTC model was validated on a mechanical equipment dataset, using data augmentation to address the imbalanced-dataset problem. In experiments, the mel spectrogram-based ADTC model exhibited the best performance across various industrial environments, with a prediction accuracy as high as 98.06%, outperforming the other compared algorithms.
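The pipeline described in the abstract can be illustrated with a minimal sketch: compute a log-mel spectrogram from a vibration or acoustic signal, then cluster feature vectors with K-means and measure the mean squared distance between points and their assigned centroids (one plausible reading of the paper's centroid-based objective). All function names, parameter values, and the pure-numpy implementations below are illustrative assumptions, not the authors' code; the hand-rolled STFT and triangular mel filter bank follow the standard textbook construction.

```python
import numpy as np

def hz_to_mel(f):
    return 2595.0 * np.log10(1.0 + f / 700.0)

def mel_to_hz(m):
    return 700.0 * (10.0 ** (m / 2595.0) - 1.0)

def mel_spectrogram(signal, sr=16000, n_fft=512, hop=256, n_mels=32):
    """Log-mel spectrogram via a Hann-windowed STFT and a triangular
    mel filter bank (standard construction; parameters are assumed)."""
    window = np.hanning(n_fft)
    frames = []
    for start in range(0, len(signal) - n_fft + 1, hop):
        frame = signal[start:start + n_fft] * window
        frames.append(np.abs(np.fft.rfft(frame)) ** 2)   # power spectrum
    power = np.array(frames).T                           # (n_fft//2+1, n_frames)

    # Triangular filters spaced uniformly on the mel scale from 0 to sr/2.
    mel_pts = np.linspace(hz_to_mel(0.0), hz_to_mel(sr / 2.0), n_mels + 2)
    bins = np.floor((n_fft + 1) * mel_to_hz(mel_pts) / sr).astype(int)
    fbank = np.zeros((n_mels, n_fft // 2 + 1))
    for m in range(1, n_mels + 1):
        lo, c, hi = bins[m - 1], bins[m], bins[m + 1]
        for k in range(lo, c):
            fbank[m - 1, k] = (k - lo) / max(c - lo, 1)
        for k in range(c, hi):
            fbank[m - 1, k] = (hi - k) / max(hi - c, 1)
    return np.log(fbank @ power + 1e-10)                 # log-mel energies

def kmeans(X, k, iters=50, seed=0):
    """Plain Lloyd's K-means (stand-in for the paper's clustering stage)."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)].astype(float)
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centroids[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return labels, centroids

def centroid_loss(X, labels, centroids):
    """Mean squared point-to-assigned-centroid distance -- the quantity
    a centroid-based calibration step would drive down (our reading)."""
    return float(((X - centroids[labels]) ** 2).sum(-1).mean())
```

In the full ADTC model the clustered features come from an autoencoder's encoder rather than raw spectrogram frames, and the loss above would be backpropagated through the encoder; the sketch only shows the spectrogram front end and the clustering/loss geometry.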

Original language: English
Article number: 119551
Journal: Expert Systems with Applications
Volume: 217
DOIs
State: Published - 1 May 2023

Keywords

  • Anomaly detection
  • Data augmentation
  • Fault diagnosis
  • Mel spectrogram
  • Time series
  • Unsupervised learning
