MAAT: Mamba Adaptive Anomaly Transformer with association discrepancy for time series

cs.LG

Authors

Abdellah Zakaria Sellam, Ilyes Benaissa, Abdelmalik Taleb-Ahmed, Luigi Patrono, Cosimo Distante
Project Resources

- ArXiv Paper (arXiv)
- Semantic Scholar Paper (Semantic Scholar)
Abstract

Anomaly detection in time series is essential for industrial monitoring and environmental sensing, yet distinguishing anomalies from complex temporal patterns remains challenging. Existing methods such as the Anomaly Transformer and DCdetector have made progress, but they remain sensitive to short-term context and inefficient in noisy, non-stationary environments. To overcome these issues, we introduce MAAT, an improved architecture that enhances association discrepancy modeling and reconstruction quality. MAAT features Sparse Attention, which captures long-range dependencies efficiently by focusing on the most relevant time steps, thereby reducing computational redundancy. Additionally, a Mamba Selective State Space Model is incorporated into the reconstruction module, combined with a skip connection and Gated Attention to improve anomaly localization and detection performance. Extensive experiments show that MAAT significantly outperforms previous methods, achieving better anomaly distinguishability and stronger generalization across diverse time series applications, setting a new standard for unsupervised time series anomaly detection in real-world scenarios.
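
The abstract describes a reconstruction module that fuses a Mamba selective state space branch with the attention path through a skip connection and Gated Attention. The paper's code is not reproduced on this page, so the snippet below is only a minimal PyTorch sketch of that fusion idea; the module name GatedMambaReconstruction, the sigmoid gate, and the mamba_block argument (for example, Mamba from the mamba_ssm package) are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn


class GatedMambaReconstruction(nn.Module):
    """Illustrative sketch (not the authors' code): fuse an attention branch with
    a selective state-space (Mamba) branch via a learned gate plus a skip
    connection, then project back to the model dimension for reconstruction."""

    def __init__(self, d_model: int, mamba_block: nn.Module):
        super().__init__()
        self.mamba = mamba_block                 # e.g. mamba_ssm.Mamba(d_model=d_model)
        self.gate = nn.Linear(2 * d_model, d_model)
        self.proj = nn.Linear(d_model, d_model)

    def forward(self, attn_out: torch.Tensor, x_embed: torch.Tensor) -> torch.Tensor:
        # attn_out: (batch, seq_len, d_model) output of the sparse-attention encoder
        # x_embed:  (batch, seq_len, d_model) embedded input, reused as the skip path
        mamba_out = self.mamba(attn_out)                      # state-space branch
        g = torch.sigmoid(self.gate(torch.cat([attn_out, mamba_out], dim=-1)))
        fused = g * mamba_out + (1.0 - g) * attn_out          # gated fusion of the two branches
        return self.proj(fused + x_embed)                     # skip connection + reconstruction head


# Toy usage with an identity placeholder standing in for a real Mamba block:
block = GatedMambaReconstruction(d_model=64, mamba_block=nn.Identity())
recon = block(torch.randn(2, 128, 64), torch.randn(2, 128, 64))
print(recon.shape)  # torch.Size([2, 128, 64])
```

The gate lets the model weigh the state-space branch against the attention branch per feature, while the skip connection preserves the embedded input for reconstruction; the actual MAAT design may differ in how these components are arranged.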
