The official implementation of the paper: "SST: Multi-Scale Hybrid Mamba-Transformer Experts for Long-Short Range Time Series Forecasting"
- Host: GitHub
- URL: https://github.com/XiongxiaoXu/Mambaformer-in-Time-Series
- Owner: XiongxiaoXu
- Created: 2024-04-18T21:26:28.000Z (10 months ago)
- Default Branch: main
- Last Pushed: 2024-11-05T04:22:07.000Z (4 months ago)
- Last Synced: 2024-11-05T04:27:37.566Z (4 months ago)
- Language: Python
- Homepage: https://arxiv.org/abs/2404.14757
- Size: 17.5 MB
- Stars: 117
- Watchers: 4
- Forks: 6
- Open Issues: 2
- Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
- Awesome-state-space-models - GitHub
README
# SST
The SST (State Space Transformer) code for the paper "[SST: Multi-Scale Hybrid Mamba-Transformer Experts for Long-Short Range Time Series Forecasting](https://arxiv.org/abs/2404.14757)".
## Contributions
* We propose to **decompose time series into global patterns and local variations according to ranges**. We identify global patterns as the focus of the long range, while local variations should be captured in the short range.
* To effectively capture long-term patterns and short-term variations, we leverage patching to create coarser patched time series (PTS) in the long range and finer PTS in the short range. Moreover, we introduce **a new metric to precisely quantify the resolution of PTS**.
* We propose a **novel hybrid Mamba-Transformer experts architecture, SST**, with Mamba as a global-patterns expert in the long range and LWT as a local-variations expert in the short range. A long-short router is designed to adaptively integrate the global patterns and local variations. **With Mamba and LWT, SST is highly scalable, with linear complexity O(L) in the time series length L** (see the sketch below).
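To make the two-scale patching and the long-short routing concrete, here is a minimal PyTorch sketch. It is not the official SST code: the patch lengths, module shapes, and the summary-vector gating below are illustrative assumptions.

```python
# Minimal sketch of two-scale patching and a long-short router.
# NOT the official SST implementation; all shapes and names are assumptions.
import torch
import torch.nn as nn

def patch(x: torch.Tensor, patch_len: int, stride: int) -> torch.Tensor:
    """Slice (batch, length, channels) into (batch, num_patches, patch_len * channels)."""
    windows = x.unfold(dimension=1, size=patch_len, step=stride)  # (B, N, C, P)
    return windows.flatten(start_dim=2)

class LongShortRouter(nn.Module):
    """Adaptively mix a global-patterns summary with a local-variations summary."""
    def __init__(self, d_model: int):
        super().__init__()
        self.gate = nn.Linear(2 * d_model, 2)

    def forward(self, g: torch.Tensor, l: torch.Tensor) -> torch.Tensor:
        # g, l: (batch, d_model) summaries from the two experts
        w = torch.softmax(self.gate(torch.cat([g, l], dim=-1)), dim=-1)
        return w[..., 0:1] * g + w[..., 1:2] * l

# Dummy usage: coarser patches for the long range, finer for the short range.
x = torch.randn(8, 96, 7)                   # (batch, length, channels)
coarse = patch(x, patch_len=24, stride=24)  # long-range input, e.g. for the Mamba expert
fine = patch(x, patch_len=4, stride=2)      # short-range input, e.g. for the LWT expert
router = LongShortRouter(d_model=64)
out = router(torch.randn(8, 64), torch.randn(8, 64))  # (8, 64) mixed representation
```

Because the patching and the gating are elementwise or linear in the number of patches, a pipeline of this shape stays O(L) in the input length, matching the scalability claim above.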
## Getting Started
### Environment
* python 3.10.13
* torch 1.12.1+cu116
* mamba-ssm 1.2.0.post1
* numpy 1.26.4
* transformers 4.38.2

For installation of the `mamba-ssm` package, refer to https://github.com/state-spaces/mamba.
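As a quick sanity check of the environment, the short snippet below (an optional sketch; the import names simply follow the packages listed above) prints the installed versions:

```python
# Optional sanity check: confirm installed versions match the environment list above.
import numpy
import torch
import transformers
import mamba_ssm  # install per https://github.com/state-spaces/mamba

print("torch:", torch.__version__)                # expect 1.12.1+cu116
print("numpy:", numpy.__version__)                # expect 1.26.4
print("transformers:", transformers.__version__)  # expect 4.38.2
print("mamba-ssm:", mamba_ssm.__version__)        # expect 1.2.0.post1
```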
### Run
To run SST on a dataset, run the corresponding `.sh` file in the `scripts` folder. For example, to run SST on the Weather dataset: `./weather.sh`
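If you prefer launching from Python (for example, inside a notebook), a small wrapper like the following works; `scripts/weather.sh` is just the Weather example above:

```python
# Convenience wrapper: run one of the provided shell scripts from Python.
import subprocess

# `scripts/weather.sh` is the Weather example referenced above.
subprocess.run(["bash", "scripts/weather.sh"], check=True)
```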
### Dataset
You can download all the datasets from the "[Autoformer](https://drive.google.com/drive/folders/1ZOYpTUa82_jCcxIdTmyr0LXQfvaM9vIy)" project. Create a `dataset` folder in the current directory and put the downloaded datasets into the `dataset` folder.
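After downloading, a small check like the one below can verify the layout; the file names are assumptions based on the usual Autoformer dataset release, so adjust them to what you actually downloaded:

```python
# Create `dataset/` and report which expected files are present.
# File names are assumptions from the usual Autoformer release; adjust as needed.
from pathlib import Path

dataset_dir = Path("dataset")
dataset_dir.mkdir(exist_ok=True)

for name in ["weather.csv", "ETTh1.csv", "ETTh2.csv", "ETTm1.csv", "ETTm2.csv"]:
    print(f"{name}: {'found' if (dataset_dir / name).exists() else 'missing'}")
```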
## Acknowledgement
We would like to sincerely thank the following awesome projects:
* Mamba (https://github.com/state-spaces/mamba)
* PatchTST (https://github.com/yuqinie98/PatchTST)
* LTSF-Linear (https://github.com/cure-lab/LTSF-Linear)
* Autoformer (https://github.com/thuml/Autoformer)
## Cite
If you find this repository useful for your work, please consider citing the paper as follows:

```bibtex
@misc{xu2024sstmultiscalehybridmambatransformer,
title={SST: Multi-Scale Hybrid Mamba-Transformer Experts for Long-Short Range Time Series Forecasting},
author={Xiongxiao Xu and Canyu Chen and Yueqing Liang and Baixiang Huang and Guangji Bai and Liang Zhao and Kai Shu},
year={2024},
eprint={2404.14757},
archivePrefix={arXiv},
primaryClass={cs.LG},
url={https://arxiv.org/abs/2404.14757},
}
```