https://github.com/timeeval/timeeval-gui
[Read-Only Mirror] Benchmarking Toolkit for Time Series Anomaly Detection Algorithms using TimeEval and GutenTAG
- Host: GitHub
- URL: https://github.com/timeeval/timeeval-gui
- Owner: TimeEval
- License: MIT
- Created: 2022-03-23T14:16:39.000Z
- Default Branch: main
- Last Pushed: 2025-04-06T17:09:47.000Z
- Last Synced: 2025-09-04T19:39:40.089Z
- Topics: benchmark-framework, benchmarking, jupyter-notebooks, numpy, pandas, python3, streamlit, time-series, time-series-analysis, time-series-anomaly-detection
- Language: Python
- Homepage:
- Size: 701 KB
- Stars: 28
- Watchers: 5
- Forks: 8
- Open Issues: 1
Metadata Files:
- Readme: README.md
- License: LICENSE
- Citation: CITATION.cff
# TimeEval GUI / Toolkit
A Benchmarking Toolkit for Time Series Anomaly Detection Algorithms
[License: MIT](https://opensource.org/licenses/MIT)

> If you use our artifacts, please consider [citing our papers](#citation).
This repository hosts an extensible, scalable, and automated benchmarking toolkit for time series anomaly detection algorithms.
TimeEval includes an extensive data generator and supports both interactive and batch evaluation scenarios.
With this toolkit, we aim to reduce the evaluation effort and help the community produce more meaningful evaluations.
The following picture shows the architecture of the TimeEval Toolkit:

It consists of four main components: a visual frontend for interactive experiments, the Python API to programmatically configure systematic batch experiments, the dataset generator GutenTAG, and the core evaluation engine (Time)Eval.
While the frontend is hosted in this repository, GutenTAG and Eval are hosted in separate repositories.
Those repositories also include their respective Python APIs:
[GutenTAG](https://github.com/TimeEval/gutentag)
[TimeEval](https://github.com/TimeEval/timeeval)
As initial resources for evaluations, we provide over 1,000 benchmark datasets and a growing collection of more than 70 time series anomaly detection algorithms:
[Benchmark datasets](https://timeeval.github.io/evaluation-paper/notebooks/Datasets.html)
[TimeEval-algorithms](https://github.com/TimeEval/TimeEval-algorithms)
## Installation and Usage (tl;dr)
TimeEval is tested on Linux and macOS and supports Python 3.7 through 3.9.
We don't support Python 3.10 or higher at the moment because downstream libraries are incompatible with it.
> We haven't tested whether TimeEval runs on Windows.
> If you use Windows, please help us by testing whether TimeEval works correctly.
> If you run into any issues, don't hesitate to contact us.
By default, TimeEval does not automatically download all available algorithms (Docker images) because there are too many.
However, you can easily download them [from our registry](https://github.com/orgs/TimeEval/packages?repo_name=TimeEval-algorithms) using Docker.
Please pull the image tag that is compatible with your version of TimeEval:
```bash
docker pull ghcr.io/timeeval/kmeans:0.3.0
```
After you have downloaded the algorithm images, restart the GUI so that it can find the new images.
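If you need several algorithm images, a small loop can generate the pull commands at a pinned tag. This is a sketch: the algorithm names besides `kmeans` are examples from our registry, and the script name in the comment is hypothetical.

```shell
# Print "docker pull" commands for several algorithm images at a pinned tag.
# Algorithm names other than kmeans are examples; check the registry for the full list.
VERSION="0.3.0"
for algo in kmeans stomp lof; do
  echo "docker pull ghcr.io/timeeval/${algo}:${VERSION}"
done
# To actually pull the images, pipe the output through a shell, e.g.:
#   ./pull-algos.sh | sh
```

Printing the commands first lets you review exactly which images and tags will be fetched before executing anything.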
### Web frontend
```shell
# install all dependencies
make install
# execute streamlit and display frontend in default browser
make run
```
Screenshots of web frontend:



### Python APIs
Install the required components using pip:
```bash
# eval component:
pip install timeeval
# dataset generator component:
pip install timeeval-gutentag
```
For usage instructions of the respective Python APIs, please consult the projects' documentation:
[GutenTAG](https://github.com/TimeEval/gutentag)
[TimeEval](https://github.com/TimeEval/timeeval)
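As a sketch of how the dataset generator is configured, the following YAML describes a single sine-based time series with one platform anomaly. The field names reflect our reading of the GutenTAG documentation; check there for the authoritative schema and parameter ranges.

```yaml
# Hypothetical GutenTAG configuration: one synthetic time series
# with a sine base oscillation and a platform anomaly in the middle.
timeseries:
  - name: demo-sine
    length: 1000
    base-oscillations:
      - kind: sine
        frequency: 2.0
    anomalies:
      - position: middle
        length: 50
        kinds:
          - kind: platform
            value: 0.0
```

Assuming the file is saved as `demo-config.yaml`, the dataset could then be generated with something like `python -m gutenTAG --config-yaml demo-config.yaml`; see the GutenTAG documentation for the exact CLI options.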
## Citation
If you use the TimeEval toolkit or any of its components in your project or research, please cite our demonstration paper:
> Phillip Wenig, Sebastian Schmidl, and Thorsten Papenbrock.
> TimeEval: A Benchmarking Toolkit for Time Series Anomaly Detection Algorithms. PVLDB, 15(12): 3678 - 3681, 2022.
> doi:[10.14778/3554821.3554873](https://doi.org/10.14778/3554821.3554873)
If you use our evaluation results or our benchmark datasets and algorithms, please cite our evaluation paper:
> Sebastian Schmidl, Phillip Wenig, and Thorsten Papenbrock.
> Anomaly Detection in Time Series: A Comprehensive Evaluation. PVLDB, 15(9): 1779 - 1797, 2022.
> doi:[10.14778/3538598.3538602](https://doi.org/10.14778/3538598.3538602)
You can use the following BibTeX entries:
```bibtex
@article{WenigEtAl2022TimeEval,
  title = {TimeEval: {{A}} Benchmarking Toolkit for Time Series Anomaly Detection Algorithms},
  author = {Wenig, Phillip and Schmidl, Sebastian and Papenbrock, Thorsten},
  date = {2022},
  journaltitle = {Proceedings of the {{VLDB Endowment}} ({{PVLDB}})},
  volume = {15},
  number = {12},
  pages = {3678--3681},
  doi = {10.14778/3554821.3554873}
}

@article{SchmidlEtAl2022Anomaly,
  title = {Anomaly Detection in Time Series: {{A}} Comprehensive Evaluation},
  author = {Schmidl, Sebastian and Wenig, Phillip and Papenbrock, Thorsten},
  date = {2022},
  journaltitle = {Proceedings of the {{VLDB Endowment}} ({{PVLDB}})},
  volume = {15},
  number = {9},
  pages = {1779--1797},
  doi = {10.14778/3538598.3538602}
}
```