https://github.com/deeplabcut/benchmark
- Host: GitHub
- URL: https://github.com/deeplabcut/benchmark
- Owner: DeepLabCut
- License: other
- Created: 2021-09-19T14:23:14.000Z (over 3 years ago)
- Default Branch: main
- Last Pushed: 2024-09-16T12:10:25.000Z (8 months ago)
- Last Synced: 2025-03-25T06:09:46.815Z (about 2 months ago)
- Language: Python
- Homepage: https://benchmark.deeplabcut.org
- Size: 43.3 MB
- Stars: 1
- Watchers: 2
- Forks: 1
- Open Issues: 3
Metadata Files:
- Readme: README.md
- License: LICENSE
README
DeepLabCut Benchmark
====================

[`benchmark.deeplabcut.org`](https://benchmark.deeplabcut.org)

Welcome to the DeepLabCut benchmark!
This repo hosts all submitted results, which are available at [benchmark.deeplabcut.org](https://benchmark.deeplabcut.org).
If you are interested in submitting to the benchmark, please find detailed instructions at [benchmark.deeplabcut.org/submission](https://benchmark.deeplabcut.org/submission).

Quickstart for developers
-------------------------

The mandatory requirements for *building the benchmark page* can be installed via
``` bash
$ pip install -r requirements.txt
```

The (non-public) ground truth data needs to be present in `benchmark/data/`. Check that this is the case by running
``` bash
$ find benchmark/data -type f
benchmark/data/CollectedData_Mackenzie.h5
benchmark/data/CollectedData_Daniel.h5
benchmark/data/CollectedData_Valentina.h5
benchmark/data/CollectedData_Mostafizur.h5
```

To use all functionality of this package and to re-run evaluations, a [DeepLabCut](https://github.com/DeepLabCut/DeepLabCut/blob/master/docs/installation.md) installation is additionally required.
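If you prefer to script the data check above, the following minimal sketch (not part of the package) verifies the same files using only the standard library; the file names are taken from the expected `find` output.

``` python
from pathlib import Path

# Ground truth files expected under benchmark/data/ (see the find output above).
EXPECTED = [
    "CollectedData_Mackenzie.h5",
    "CollectedData_Daniel.h5",
    "CollectedData_Valentina.h5",
    "CollectedData_Mostafizur.h5",
]

missing = [name for name in EXPECTED if not (Path("benchmark/data") / name).exists()]
if missing:
    raise SystemExit(f"Missing ground truth files: {missing}")
print("All ground truth files are present.")
```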
Check that the package works as expected by running
``` bash
$ python -m pytest tests
```

which should finish without errors or warnings.
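To make the "no warnings" expectation explicit, the test suite can also be launched from Python with warnings escalated to errors; this uses the standard `pytest.main` entry point and the generic `-W` warning-filter option, and is a suggestion rather than part of the repository's tooling.

``` python
import sys

import pytest

# Run the test suite in tests/ and turn any warning into a failure,
# matching the "no errors or warnings" expectation above.
sys.exit(pytest.main(["tests", "-W", "error"]))
```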
To re-evaluate all available models, run
``` bash
$ python -m benchmark
```

or, if you want to run in debugging mode,
``` bash
$ python -m benchmark --nocache --onerror raise
```

from the repository root.
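The same debugging invocation can be launched from a Python session; the sketch below uses only the standard library's `runpy` and assumes it is executed from the repository root with the requirements installed.

``` python
import runpy
import sys

# Equivalent to running "python -m benchmark --nocache --onerror raise"
# from the shell: set argv as the module would see it, then execute the
# benchmark package as __main__.
sys.argv = ["benchmark", "--nocache", "--onerror", "raise"]
runpy.run_module("benchmark", run_name="__main__")
```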
To manually build the documentation, run
``` bash
$ make deploy
```