https://github.com/blackhc/batchbald
Efficient and Diverse Batch Acquisition for Deep Bayesian Active Learning.
- Host: GitHub
- URL: https://github.com/blackhc/batchbald
- Owner: BlackHC
- License: gpl-3.0
- Created: 2019-06-12T12:54:18.000Z (over 6 years ago)
- Default Branch: master
- Last Pushed: 2024-06-05T17:48:25.000Z (over 1 year ago)
- Last Synced: 2025-03-31T11:02:47.739Z (11 months ago)
- Topics: activelearning, deep-learning, machine-learning, reproduction-code
- Language: Python
- Homepage: https://blackhc.github.io/BatchBALD/
- Size: 12.5 MB
- Stars: 238
- Watchers: 8
- Forks: 55
- Open Issues: 1
Metadata Files:
- Readme: README.md
- License: LICENSE
# BatchBALD
**Note:** A more modular re-implementation can be found at https://github.com/BlackHC/batchbald_redux.
---
This is the code drop for our paper
[BatchBALD: Efficient and Diverse Batch Acquisition for Deep Bayesian Active Learning](https://arxiv.org/abs/1906.08158).
The code is provided as-is.
See https://github.com/BlackHC/batchbald_redux and https://blackhc.github.io/batchbald_redux/ for a reimplementation.
ElementAI's Baal framework also supports BatchBALD: https://github.com/ElementAI/baal/.
Please cite us:
```
@misc{kirsch2019batchbald,
    title={BatchBALD: Efficient and Diverse Batch Acquisition for Deep Bayesian Active Learning},
    author={Andreas Kirsch and Joost van Amersfoort and Yarin Gal},
    year={2019},
    eprint={1906.08158},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}
```
## How to run it
Make sure you install all requirements using:
```
conda install pytorch torchvision cudatoolkit=10.0 -c pytorch
pip install -r requirements.txt
```
and you can start an experiment using:
```
python src/run_experiment.py --quickquick --num_inference_samples 10 --available_sample_k 40
```
which starts an experiment on a subset of MNIST with 10 MC dropout samples and acquisition size 40.
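The acquisition in that experiment is driven by mutual-information scores computed from the MC dropout samples. As a rough illustration of the underlying idea, here is a minimal NumPy sketch of the plain (non-batch) BALD score, i.e. the entropy of the mean prediction minus the mean per-sample entropy; the joint BatchBALD objective from the paper is more involved, and the function name `bald_scores` is ours for illustration, not part of this repo:

```python
import numpy as np

def bald_scores(probs):
    """Per-point BALD mutual-information scores.

    probs: array of shape (num_points, num_mc_samples, num_classes)
    holding class probabilities from K stochastic forward passes
    (e.g. MC dropout). Note: this is the standard single-point BALD
    score, not the joint BatchBALD objective.
    """
    eps = 1e-12  # avoid log(0)
    mean_probs = probs.mean(axis=1)  # average prediction, shape (N, C)
    # Entropy of the averaged prediction (total uncertainty).
    entropy_of_mean = -(mean_probs * np.log(mean_probs + eps)).sum(axis=-1)
    # Average entropy of the individual predictions (aleatoric part).
    mean_entropy = -(probs * np.log(probs + eps)).sum(axis=-1).mean(axis=1)
    # Mutual information: high when samples disagree confidently.
    return entropy_of_mean - mean_entropy

# Two MC samples that disagree score higher than two that agree:
disagree = np.array([[[0.9, 0.1], [0.1, 0.9]]])
agree = np.array([[[0.9, 0.1], [0.9, 0.1]]])
```

Points whose dropout samples make confident but conflicting predictions get the highest scores, which is exactly what the acquisition loop exploits.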
Have fun playing around with it!