2026 (English). In: IEEE Transactions on Industrial Informatics, ISSN 1551-3203, E-ISSN 1941-0050. Article in journal (Refereed). Epub ahead of print.
Abstract [en]
Automatic fault detection and diagnosis (FDD) is critical for maintaining reliable and efficient industrial systems. However, conventional methods rely heavily on manual inspections or threshold-based techniques, which often fail to capture the dynamic patterns in time series (TS) sensor data. As a result, faults persist for extended periods, leading to suboptimal system operation, increased energy waste, and significant economic losses. This work proposes a cross-modal framework that facilitates the efficient deployment of state-of-the-art pretrained vision models for enhanced FDD, with two novel TS-to-image transformations: first, an adapter deep encoder that learns optimal, task-specific representations from raw sensor data while generating outputs that are input-compliant with pretrained models; second, an enhanced line plot that creates geometric shapes from two related signals. Comparative experiments against fixed transformation methods, including spectrograms, Gramian angular fields, Markov transition fields, and recurrence plots, and against five deep learning baseline models, showed substantial performance gains across diverse domains. InceptionTime achieved the highest average baseline performance with an F1 score of 88.6%, while the adapter and shapes transformations achieved 94.4% and 92.4%, respectively. The findings highlight the potential of the cross-modal framework for FDD to facilitate early intervention and efficient system maintenance in industrial settings.
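To make the TS-to-image idea concrete, the sketch below implements the Gramian angular field, one of the fixed baseline transformations named in the abstract (not the paper's novel adapter or shapes methods). It follows the standard formulation: rescale the series to [-1, 1], map samples to polar angles via arccos, and build the image from pairwise angle sums. The function name and example signal are illustrative, not from the paper.

import numpy as np

def gramian_angular_field(x: np.ndarray, summation: bool = True) -> np.ndarray:
    """Encode a 1-D time series as a Gramian angular field image.

    The series is rescaled to [-1, 1] and mapped to polar angles via
    arccos; pairwise angle sums (GASF) or differences (GADF) form the
    2-D image that a pretrained vision model can consume.
    """
    # Min-max rescale to [-1, 1]; clip guards against rounding drift.
    x_scaled = np.clip(
        (2 * x - x.max() - x.min()) / (x.max() - x.min()), -1.0, 1.0
    )
    phi = np.arccos(x_scaled)  # polar-coordinate angles in [0, pi]
    if summation:
        # GASF: cos(phi_i + phi_j) for every sample pair (i, j)
        return np.cos(phi[:, None] + phi[None, :])
    # GADF: sin(phi_i - phi_j)
    return np.sin(phi[:, None] - phi[None, :])

# Example: a 128-sample noisy sensor trace becomes a 128x128 image.
signal = np.sin(np.linspace(0, 8 * np.pi, 128)) + 0.1 * np.random.randn(128)
image = gramian_angular_field(signal)
print(image.shape)  # (128, 128)

A fixed transform like this requires no training, which is why the paper contrasts it with the learned adapter encoder: the adapter can tailor the image representation to the downstream FDD task rather than applying one formula to every signal.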
Place, publisher, year, edition, pages
IEEE Computer Society, 2026
Keywords
Cross-modal adaptation, deep learning, fault detection and diagnosis (FDD), pretrained vision models, time series (TS), transfer learning (TL)
National Category
Artificial Intelligence; Industrial Engineering and Management
Identifiers
urn:nbn:se:bth-29203 (URN); 10.1109/TII.2026.3659264 (DOI); 001691144300001; 2-s2.0-105030196076 (Scopus ID)
Available from: 2026-02-27. Created: 2026-02-27. Last updated: 2026-02-27. Bibliographically approved.