https://github.com/KimMeen/Time-LLM
[ICLR 2024] Official implementation of " 🦙 Time-LLM: Time Series Forecasting by Reprogramming Large Language Models"
- Host: GitHub
- URL: https://github.com/KimMeen/Time-LLM
- Owner: KimMeen
- License: apache-2.0
- Created: 2024-01-20T01:26:30.000Z (over 1 year ago)
- Default Branch: main
- Last Pushed: 2024-11-03T09:23:04.000Z (6 months ago)
- Last Synced: 2025-04-05T02:01:35.860Z (27 days ago)
- Topics: cross-modal-learning, cross-modality, deep-learning, language-model, large-language-models, machine-learning, multimodal-deep-learning, multimodal-time-series, prompt-tuning, time-series, time-series-analysis, time-series-forecast, time-series-forecasting
- Language: Python
- Homepage: https://arxiv.org/abs/2310.01728
- Size: 1.06 MB
- Stars: 1,906
- Watchers: 21
- Forks: 331
- Open Issues: 23
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
- StarryDivineSky - KimMeen/Time-LLM - Time-LLM: time series forecasting by reprogramming large language models. Time-LLM is a reprogramming framework that repurposes LLMs for general time series forecasting while keeping the backbone language model intact. Notably, time series analysis (e.g., forecasting) can be cast as yet another "language task" that an off-the-shelf LLM can handle. Time-LLM comprises two key components: (1) reprogramming the input time series into text prototype representations that are more natural for the LLM, and (2) augmenting the input context with declarative prompts (e.g., domain expert knowledge and task instructions) to guide LLM reasoning. (March 2024): Time-LLM has been upgraded into a general framework for repurposing a wide range of language models to time series forecasting; it now supports Llama-7B by default and is also compatible with two smaller PLMs (GPT-2 and BERT). Simply adjust --llm_model and --llm_dim to switch backbones. Time-LLM has been included in NeuralForecast and has been adopted by XiMou Optimization Technology Co., Ltd. (XMO) for solar, wind, and weather forecasting. (Time series / Web services, other)
README
(ICLR'24) Time-LLM: Time Series Forecasting by Reprogramming Large Language Models



**[Paper Page]**
**[YouTube Talk 1]**
**[YouTube Talk 2]**
**[Medium Blog]**
**[机器之心 Chinese Explainer]**
**[量子位 Chinese Explainer]**
**[时序人 Chinese Explainer]**
**[AI算法厨房 Chinese Explainer]**
**[知乎 Chinese Explainer]**
---
>
> 🙋 Please let us know if you find a mistake or have any suggestions!
>
> 🌟 If you find this resource helpful, please consider starring this repository and citing our research:

```bibtex
@inproceedings{jin2023time,
title={{Time-LLM}: Time series forecasting by reprogramming large language models},
author={Jin, Ming and Wang, Shiyu and Ma, Lintao and Chu, Zhixuan and Zhang, James Y and Shi, Xiaoming and Chen, Pin-Yu and Liang, Yuxuan and Li, Yuan-Fang and Pan, Shirui and Wen, Qingsong},
booktitle={International Conference on Learning Representations (ICLR)},
year={2024}
}
```

## Updates/News:
🚩 **News** (Aug. 2024): Time-LLM has been adopted by XiMou Optimization Technology Co., Ltd. (XMO) for Solar, Wind, and Weather Forecasting.
🚩 **News** (May 2024): Time-LLM has been included in [NeuralForecast](https://github.com/Nixtla/neuralforecast). Special thanks to the contributors @[JQGoh](https://github.com/JQGoh) and @[marcopeix](https://github.com/marcopeix)!
🚩 **News** (March 2024): Time-LLM has been upgraded to serve as a general framework for repurposing a wide range of language models to time series forecasting. It now defaults to supporting Llama-7B and includes compatibility with two additional smaller PLMs (GPT-2 and BERT). Simply adjust `--llm_model` and `--llm_dim` to switch backbones.
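For example, a GPT-2 backbone could be selected as sketched below. This is not one of the provided scripts: only `--llm_model` and `--llm_dim` come from the note above, while the identifier `GPT2`, the hidden size 768, and the remaining flags are assumptions based on the demo scripts under `./scripts`; check `run_main.py` for the authoritative argument list.

```bash
# Sketch (assumptions noted above): swap the default Llama-7B backbone for GPT-2.
# --llm_dim should match the backbone's hidden size (assumed 4096 for Llama-7B, 768 for GPT-2).
# Copy the remaining task/data arguments from ./scripts/TimeLLM_ETTh1.sh.
accelerate launch run_main.py \
  --task_name long_term_forecast \
  --model TimeLLM \
  --llm_model GPT2 \
  --llm_dim 768
```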
## Introduction
Time-LLM is a reprogramming framework to repurpose LLMs for general time series forecasting with the backbone language models kept intact.
Notably, we show that time series analysis (e.g., forecasting) can be cast as yet another "language task" that can be effectively tackled by an off-the-shelf LLM.
- Time-LLM comprises two key components: (1) reprogramming the input time series into text prototype representations that are more natural for the LLM, and (2) augmenting the input context with declarative prompts (e.g., domain expert knowledge and task instructions) to guide LLM reasoning.
## Requirements
Use Python 3.11 from MiniConda.

- torch==2.2.2
- accelerate==0.28.0
- einops==0.7.0
- matplotlib==3.7.0
- numpy==1.23.5
- pandas==1.5.3
- scikit_learn==1.2.2
- scipy==1.12.0
- tqdm==4.65.0
- peft==0.4.0
- transformers==4.31.0
- deepspeed==0.14.0
- sentencepiece==0.2.0

To install all dependencies:
```
pip install -r requirements.txt
```
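If you are starting from a fresh MiniConda install (as suggested above), a minimal environment setup might look like the following sketch; the environment name `timellm` is only an example.

```bash
# Sketch: create and activate a Python 3.11 environment, then install the pinned dependencies.
# The environment name "timellm" is arbitrary.
conda create -n timellm python=3.11 -y
conda activate timellm
pip install -r requirements.txt
```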
## Datasets
You can access the well pre-processed datasets from [[Google Drive]](https://drive.google.com/file/d/1NF7VEefXCmXuWNbnNe858WvQAkJ_7wuP/view?usp=sharing), then place the downloaded contents under `./dataset`.

## Quick Demos
1. Download datasets and place them under `./dataset`
2. Tune the model. We provide five experiment scripts for demonstration purposes under the folder `./scripts`. For example, you can evaluate on the ETT datasets by running:

```bash
bash ./scripts/TimeLLM_ETTh1.sh
```
```bash
bash ./scripts/TimeLLM_ETTh2.sh
```
```bash
bash ./scripts/TimeLLM_ETTm1.sh
```
```bash
bash ./scripts/TimeLLM_ETTm2.sh
```

## Detailed usage
Please refer to `run_main.py`, `run_m4.py`, and `run_pretrain.py` for a detailed description of each hyperparameter.
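Assuming these entry points expose their options via `argparse` (as is typical for this code base's lineage), the full list of hyperparameters, defaults, and help strings can also be printed from the command line:

```bash
# Assuming argparse-based entry points, this prints every hyperparameter with its default and help text.
python run_main.py --help
```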
## Further Reading
1. [**TimeMixer++: A General Time Series Pattern Machine for Universal Predictive Analysis**](https://arxiv.org/abs/2410.16032), in *arXiv* 2024.
[\[GitHub Repo\]](https://github.com/kwuking/TimeMixer/blob/main/README.md)

**Authors**: Shiyu Wang, Jiawei Li, Xiaoming Shi, Zhou Ye, Baichuan Mo, Wenze Lin, Shengtong Ju, Zhixuan Chu, Ming Jin
```bibtex
@article{wang2024timemixer++,
title={TimeMixer++: A General Time Series Pattern Machine for Universal Predictive Analysis},
author={Wang, Shiyu and Li, Jiawei and Shi, Xiaoming and Ye, Zhou and Mo, Baichuan and Lin, Wenze and Ju, Shengtong and Chu, Zhixuan and Jin, Ming},
journal={arXiv preprint arXiv:2410.16032},
year={2024}
}
```

2. [**Foundation Models for Time Series Analysis: A Tutorial and Survey**](https://arxiv.org/pdf/2403.14735), in *KDD* 2024.
**Authors**: Yuxuan Liang, Haomin Wen, Yuqi Nie, Yushan Jiang, Ming Jin, Dongjin Song, Shirui Pan, Qingsong Wen*
```bibtex
@inproceedings{liang2024foundation,
title={Foundation models for time series analysis: A tutorial and survey},
author={Liang, Yuxuan and Wen, Haomin and Nie, Yuqi and Jiang, Yushan and Jin, Ming and Song, Dongjin and Pan, Shirui and Wen, Qingsong},
booktitle={ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD 2024)},
year={2024}
}
```

3. [**Position Paper: What Can Large Language Models Tell Us about Time Series Analysis**](https://arxiv.org/abs/2402.02713), in *ICML* 2024.
**Authors**: Ming Jin, Yifan Zhang, Wei Chen, Kexin Zhang, Yuxuan Liang*, Bin Yang, Jindong Wang, Shirui Pan, Qingsong Wen*
```bibtex
@inproceedings{jin2024position,
title={Position Paper: What Can Large Language Models Tell Us about Time Series Analysis},
author={Ming Jin and Yifan Zhang and Wei Chen and Kexin Zhang and Yuxuan Liang and Bin Yang and Jindong Wang and Shirui Pan and Qingsong Wen},
booktitle={International Conference on Machine Learning (ICML 2024)},
year={2024}
}
```

4. [**Large Models for Time Series and Spatio-Temporal Data: A Survey and Outlook**](https://arxiv.org/abs/2310.10196), in *arXiv* 2023.
[\[GitHub Repo\]](https://github.com/qingsongedu/Awesome-TimeSeries-SpatioTemporal-LM-LLM)

**Authors**: Ming Jin, Qingsong Wen*, Yuxuan Liang, Chaoli Zhang, Siqiao Xue, Xue Wang, James Zhang, Yi Wang, Haifeng Chen, Xiaoli Li (IEEE Fellow), Shirui Pan*, Vincent S. Tseng (IEEE Fellow), Yu Zheng (IEEE Fellow), Lei Chen (IEEE Fellow), Hui Xiong (IEEE Fellow)
```bibtex
@article{jin2023lm4ts,
title={Large Models for Time Series and Spatio-Temporal Data: A Survey and Outlook},
author={Ming Jin and Qingsong Wen and Yuxuan Liang and Chaoli Zhang and Siqiao Xue and Xue Wang and James Zhang and Yi Wang and Haifeng Chen and Xiaoli Li and Shirui Pan and Vincent S. Tseng and Yu Zheng and Lei Chen and Hui Xiong},
journal={arXiv preprint arXiv:2310.10196},
year={2023}
}
```

5. [**Transformers in Time Series: A Survey**](https://arxiv.org/abs/2202.07125), in *IJCAI* 2023.
[\[GitHub Repo\]](https://github.com/qingsongedu/time-series-transformers-review)

**Authors**: Qingsong Wen, Tian Zhou, Chaoli Zhang, Weiqi Chen, Ziqing Ma, Junchi Yan, Liang Sun
```bibtex
@inproceedings{wen2023transformers,
title={Transformers in time series: A survey},
author={Wen, Qingsong and Zhou, Tian and Zhang, Chaoli and Chen, Weiqi and Ma, Ziqing and Yan, Junchi and Sun, Liang},
booktitle={International Joint Conference on Artificial Intelligence (IJCAI)},
year={2023}
}
```

6. [**TimeMixer: Decomposable Multiscale Mixing for Time Series Forecasting**](https://openreview.net/pdf?id=7oLshfEIC2), in *ICLR* 2024.
[\[GitHub Repo\]](https://github.com/kwuking/TimeMixer)

**Authors**: Shiyu Wang, Haixu Wu, Xiaoming Shi, Tengge Hu, Huakun Luo, Lintao Ma, James Y. Zhang, Jun Zhou
```bibtex
@inproceedings{wang2023timemixer,
title={TimeMixer: Decomposable Multiscale Mixing for Time Series Forecasting},
author={Wang, Shiyu and Wu, Haixu and Shi, Xiaoming and Hu, Tengge and Luo, Huakun and Ma, Lintao and Zhang, James Y and Zhou, Jun},
booktitle={International Conference on Learning Representations (ICLR)},
year={2024}
}
```

## Acknowledgement
Our implementation adapts [Time-Series-Library](https://github.com/thuml/Time-Series-Library) and [OFA (GPT4TS)](https://github.com/DAMO-DI-ML/NeurIPS2023-One-Fits-All) as the code base, and we have extensively modified them for our purposes. We thank the authors for sharing their implementations and related resources.