https://github.com/XiongxiaoXu/SST
The official implementation of the paper: "SST: Multi-Scale Hybrid Mamba-Transformer Experts for Long-Short Range Time Series Forecasting"
- Host: GitHub
- URL: https://github.com/XiongxiaoXu/SST
- Owner: XiongxiaoXu
- Created: 2024-04-18T21:26:28.000Z (about 1 year ago)
- Default Branch: main
- Last Pushed: 2025-02-21T15:14:32.000Z (3 months ago)
- Last Synced: 2025-02-21T16:26:38.724Z (3 months ago)
- Language: Python
- Homepage: https://arxiv.org/abs/2404.14757
- Size: 17.5 MB
- Stars: 146
- Watchers: 3
- Forks: 9
- Open Issues: 1
- Metadata Files:
  - Readme: README.md
Awesome Lists containing this project:
- Awesome-state-space-models
README
# SST
The SST (State Space Transformer) code for the paper "[SST: Multi-Scale Hybrid Mamba-Transformer Experts for Long-Short Range Time Series Forecasting](https://arxiv.org/abs/2404.14757)".
## Contributions
* We propose to **decompose time series into global patterns and local variations according to ranges**. We identify global patterns as the focus of the long range, while local variations should be captured in the short range.
* To effectively capture long-term patterns and short-term variations, we leverage patching to create coarser patched time series (PTS) in the long range and finer PTS in the short range. Moreover, we introduce **a new metric to precisely quantify the resolution of PTS**.
* We propose a **novel hybrid Mamba-Transformer experts architecture, SST**, with Mamba as a global-patterns expert in the long range and LWT as a local-variations expert in the short range. A long-short router is designed to adaptively integrate the global patterns and local variations. **With Mamba and LWT, SST is highly scalable, with linear complexity O(L) in the time series length L.** (A toy sketch of the patching and routing ideas follows this list.)
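To make these ideas concrete, below is a minimal, self-contained PyTorch sketch, not the repository's implementation: the names (`patch`, `LongShortRouter`, `SSTSketch`) and all hyperparameters are hypothetical, and the two experts are stand-in linear layers where SST uses a Mamba block and an LWT.

```python
# A toy illustration of (1) patching a series at two resolutions and
# (2) routing between a long-range and a short-range expert.
import torch
import torch.nn as nn

def patch(x, patch_len, stride):
    """Split a series [B, L, C] into patches [B, N, C * patch_len]."""
    # Coarser patches (large patch_len) for the long range,
    # finer patches (small patch_len) for the short range.
    patches = x.unfold(1, patch_len, stride)   # [B, N, C, patch_len]
    return patches.flatten(2)

class LongShortRouter(nn.Module):
    """Learns per-sample mixing weights for the two experts."""
    def __init__(self, in_len, n_channels, hidden=64):
        super().__init__()
        self.gate = nn.Sequential(
            nn.Linear(in_len * n_channels, hidden), nn.ReLU(),
            nn.Linear(hidden, 2), nn.Softmax(dim=-1),
        )

    def forward(self, x):               # x: [B, L, C]
        return self.gate(x.flatten(1))  # [B, 2]

class SSTSketch(nn.Module):
    def __init__(self, in_len=96, out_len=24, n_channels=7):
        super().__init__()
        self.long_patch, self.short_patch = 16, 4  # coarse vs. fine PTS
        n_long = in_len // self.long_patch
        n_short = in_len // self.short_patch
        # Stand-in experts: in SST these are a Mamba block (long range)
        # and a local-window Transformer, LWT (short range).
        self.long_expert = nn.Linear(
            n_long * self.long_patch * n_channels, out_len * n_channels)
        self.short_expert = nn.Linear(
            n_short * self.short_patch * n_channels, out_len * n_channels)
        self.router = LongShortRouter(in_len, n_channels)
        self.out_len, self.n_channels = out_len, n_channels

    def forward(self, x):               # x: [B, L, C]
        b = x.size(0)
        long_in = patch(x, self.long_patch, self.long_patch).flatten(1)
        short_in = patch(x, self.short_patch, self.short_patch).flatten(1)
        y_long = self.long_expert(long_in)
        y_short = self.short_expert(short_in)
        w = self.router(x)              # [B, 2] soft mixture weights
        y = w[:, :1] * y_long + w[:, 1:] * y_short
        return y.view(b, self.out_len, self.n_channels)

x = torch.randn(8, 96, 7)               # batch of 8 series, 7 channels
print(SSTSketch()(x).shape)              # torch.Size([8, 24, 7])
```

The gate here is a simple soft mixture over the two branches; SST's actual long-short router may differ, so treat this purely as an illustration of routing between a coarse-patch expert and a fine-patch expert.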
## Getting Started
### Environment
* python 3.10.13
* torch 1.12.1+cu116
* mamba-ssm 1.2.0.post1
* numpy 1.26.4
* transformers 4.38.2

For installation of the mamba-ssm package, refer to https://github.com/state-spaces/mamba.
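A quick, hedged way to verify the environment is importable (assuming the packages above are installed):

```python
# Sanity check for the environment listed above; the versions in the
# comments are the ones reported in this README, not hard requirements.
import numpy
import torch
import transformers

print("torch:", torch.__version__)                # expect 1.12.1+cu116
print("numpy:", numpy.__version__)                # expect 1.26.4
print("transformers:", transformers.__version__)  # expect 4.38.2

# mamba-ssm ships CUDA kernels; this import fails if they did not build.
import mamba_ssm
print("mamba-ssm imported OK")
```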
### Run
To run SST on the various datasets, run the corresponding `.sh` files in the `scripts` folder. For example, to run SST on the Weather dataset: `./weather.sh`
### Dataset
You can download all the datasets from the "[Autoformer](https://drive.google.com/drive/folders/1ZOYpTUa82_jCcxIdTmyr0LXQfvaM9vIy)" project. Create a `dataset` folder in the current directory and put the downloaded datasets into it.
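As an optional check that the data landed where the scripts expect it, a small snippet like the following can be used. The exact file layout here is an assumption (the Autoformer CSVs use a leading `date` column, e.g. `dataset/weather/weather.csv`); verify the paths against the `.sh` files in the `scripts` folder.

```python
# Hypothetical path; adjust to match your download and the run scripts.
import pandas as pd

df = pd.read_csv("dataset/weather/weather.csv")
print(df.shape)                 # rows x (1 date column + feature columns)
print(df.columns[:3].tolist())  # first column should be 'date'
```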
## Acknowledgement
We would like to greatly thank the following awesome projects:
* Mamba (https://github.com/state-spaces/mamba)
* PatchTST (https://github.com/yuqinie98/PatchTST)
* LTSF-Linear (https://github.com/cure-lab/LTSF-Linear)
* Autoformer (https://github.com/thuml/Autoformer)
## Cite
If you find this repository useful for your work, please consider citing the paper as follows (bib format from arXiv):
```bibtex
@article{xu2024sst,
title={SST: Multi-Scale Hybrid Mamba-Transformer Experts for Long-Short Range Time Series Forecasting},
author={Xu, Xiongxiao and Chen, Canyu and Liang, Yueqing and Huang, Baixiang and Bai, Guangji and Zhao, Liang and Shu, Kai},
journal={arXiv preprint arXiv:2404.14757},
year={2024}
}
```