Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/thuml/Anomaly-Transformer
About: Code release for "Anomaly Transformer: Time Series Anomaly Detection with Association Discrepancy" (ICLR 2022 Spotlight), https://openreview.net/forum?id=LzQQ89U1qm_
Topics: anomaly-detection, deep-learning, time-series
- Host: GitHub
- URL: https://github.com/thuml/Anomaly-Transformer
- Owner: thuml
- License: MIT
- Created: 2022-04-22T02:25:58.000Z (over 2 years ago)
- Default Branch: main
- Last Pushed: 2023-12-29T11:36:58.000Z (12 months ago)
- Last Synced: 2024-08-03T09:09:43.450Z (5 months ago)
- Topics: anomaly-detection, deep-learning, time-series
- Language: Python
- Size: 26 MB
- Stars: 681
- Watchers: 8
- Forks: 179
- Open Issues: 26
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
- StarryDivineSky - thuml/Anomaly-Transformer
README
# Anomaly-Transformer (ICLR 2022 Spotlight)
Anomaly Transformer: Time Series Anomaly Detection with Association Discrepancy

Unsupervised detection of anomaly points in time series is a challenging problem that requires the model to learn informative representations and derive a distinguishable criterion. In this paper, we propose the Anomaly Transformer with three key designs (see the sketch below the list):
- An inherent distinguishable criterion as **Association Discrepancy** for detection.
- A new **Anomaly-Attention** mechanism to compute the association discrepancy.
- A **minimax strategy** to amplify the normal-abnormal distinguishability of the association discrepancy.
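For intuition, the paper defines the Association Discrepancy as the symmetrized KL divergence between each point's prior-association (a learnable Gaussian kernel over relative distances) and its series-association (the self-attention distribution). Below is a minimal PyTorch sketch of that quantity; the function names are ours for illustration, not the repo's API:

```python
import torch

def kl_div(p, q, eps=1e-8):
    # KL(p || q) per query position, summed over the key axis.
    # p, q: attention maps of shape (batch, heads, length, length),
    # where each row is a probability distribution over keys.
    return torch.sum(p * (torch.log(p + eps) - torch.log(q + eps)), dim=-1)

def association_discrepancy(series_assoc, prior_assoc):
    # Symmetrized KL between series- and prior-association,
    # averaged over heads -> shape (batch, length). The minimax
    # strategy alternately minimizes and maximizes this quantity
    # to amplify the normal-vs-abnormal contrast.
    return (kl_div(series_assoc, prior_assoc)
            + kl_div(prior_assoc, series_assoc)).mean(dim=1)
```

Because an anomaly's attention tends to concentrate on adjacent time points, its series-association stays close to the prior, so a small discrepancy serves as the anomaly criterion.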
## Get Started
1. Install Python 3.6, PyTorch >= 1.4.0.
(Thanks to Élise for contributing the environment fix. See this [issue](https://github.com/thuml/Anomaly-Transformer/issues/11) for details.)
2. Download data. You can obtain four benchmarks from [Google Cloud](https://drive.google.com/drive/folders/1gisthCoE-RrKJ0j3KPV7xiibhHWT9qRm?usp=sharing). **All the datasets are well pre-processed.** For the SWaT dataset, you can apply for access by following its official tutorial.
3. Train and evaluate. We provide the experiment scripts of all benchmarks under the folder `./scripts`. You can reproduce the experiment results as follows:
```bash
bash ./scripts/SMD.sh
bash ./scripts/MSL.sh
bash ./scripts/SMAP.sh
bash ./scripts/PSM.sh
```

Note that we use the adjustment operation proposed by [Xu et al., 2018](https://arxiv.org/pdf/1802.03903.pdf) for model evaluation. If you have questions about this, please see this [issue](https://github.com/thuml/Anomaly-Transformer/issues/14) or email us.
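For clarity, the adjustment credits an entire ground-truth anomaly segment once any point inside it is flagged. Here is a minimal NumPy sketch of our reading of that operation (illustrative only, not the repo's exact implementation):

```python
import numpy as np

def point_adjust(gt, pred):
    # Point-adjustment (Xu et al., 2018): if any point inside a
    # ground-truth anomaly segment is detected, mark every point
    # of that segment as detected. gt/pred are 0-1 arrays.
    gt = np.asarray(gt, dtype=bool)
    pred = np.asarray(pred, dtype=bool).copy()
    i, n = 0, len(gt)
    while i < n:
        if gt[i]:
            j = i
            while j < n and gt[j]:   # walk to the segment's end
                j += 1
            if pred[i:j].any():      # segment hit at least once
                pred[i:j] = True     # credit the whole segment
            i = j
        else:
            i += 1
    return pred.astype(int)
```

For example, `point_adjust([0, 1, 1, 1, 0], [0, 0, 1, 0, 0])` returns `[0, 1, 1, 1, 0]`; precision and recall are then computed on the adjusted predictions.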
## Main Results
We compare our model with 15 baselines, including THOC and InterFusion. **Generally, Anomaly-Transformer achieves state-of-the-art results.**
## Citation
If you find this repo useful, please cite our paper.

```
@inproceedings{
xu2022anomaly,
title={Anomaly Transformer: Time Series Anomaly Detection with Association Discrepancy},
author={Jiehui Xu and Haixu Wu and Jianmin Wang and Mingsheng Long},
booktitle={International Conference on Learning Representations},
year={2022},
url={https://openreview.net/forum?id=LzQQ89U1qm_}
}
```

## Contact
If you have any questions, please contact [email protected].