Multi-Armed Bandit algorithms applied to the MovieLens 20M dataset
https://github.com/jldbc/bandits
- Host: GitHub
- URL: https://github.com/jldbc/bandits
- Owner: jldbc
- License: mit
- Created: 2019-11-29T21:14:14.000Z (over 5 years ago)
- Default Branch: master
- Last Pushed: 2020-08-09T05:57:24.000Z (over 4 years ago)
- Last Synced: 2025-03-31T13:26:03.091Z (24 days ago)
- Language: Python
- Homepage:
- Size: 4.82 MB
- Stars: 56
- Watchers: 2
- Forks: 17
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# Multi-Armed Bandits
Implementations of the UCB1, Bayesian UCB, Epsilon Greedy, and EXP3 bandit algorithms on the MovieLens 20M dataset. Algorithms are evaluated offline using replay.
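The repo's actual evaluation code lives under `bandits/`; as a minimal sketch of the replay idea under stated assumptions (a logged stream of `(arm, reward)` pairs; the names `replay_evaluate` and `epsilon_greedy` are illustrative, not the repo's API):

```python
import numpy as np

def replay_evaluate(policy, log, n_arms):
    """Offline replay: stream logged (arm, reward) events; an event is
    kept only when the policy picks the same arm the logger did, so the
    kept rewards estimate the policy's own online performance."""
    counts = np.zeros(n_arms)       # matched pulls per arm
    reward_sums = np.zeros(n_arms)  # cumulative matched reward per arm
    matched = []
    for logged_arm, reward in log:
        if policy(counts, reward_sums) == logged_arm:
            counts[logged_arm] += 1
            reward_sums[logged_arm] += reward
            matched.append(reward)
    return np.mean(matched) if matched else 0.0

def epsilon_greedy(epsilon=0.1):
    """Explore a uniformly random arm with probability epsilon,
    otherwise exploit the arm with the best empirical mean reward."""
    def policy(counts, reward_sums):
        if counts.sum() == 0 or np.random.rand() < epsilon:
            return np.random.randint(len(counts))
        means = reward_sums / np.maximum(counts, 1)  # avoid 0/0
        return int(np.argmax(means))
    return policy

# Toy logged stream standing in for the MovieLens ratings:
# (movie_index, binary_reward) pairs from a hypothetical uniform logger.
rng = np.random.default_rng(0)
log = [(int(rng.integers(5)), int(rng.integers(2))) for _ in range(10_000)]
print(replay_evaluate(epsilon_greedy(0.1), log, n_arms=5))
```

Because replay only scores the events where the policy's choice matches the log, logs collected under a uniform-random policy give an unbiased estimate, at the cost of discarding most events.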
To reproduce:
```
git clone https://github.com/jldbc/bandits
cd bandits/bandits
bash run.sh
```
[Experiment setup details](https://jamesrledoux.com/algorithms/offline-bandit-evaluation/)
[Implementation details and results](https://jamesrledoux.com/algorithms/bandit-algorithms-epsilon-ucb-exp-python/)
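For reference, the selection rule behind the UCB1 variant can be sketched as below; `ucb1_select` is an illustrative name using the same `(counts, reward_sums)` signature as the replay sketch above, not the repo's function.

```python
import numpy as np

def ucb1_select(counts, reward_sums):
    """UCB1 (Auer et al., 2002): choose the arm maximizing
    mean_i + sqrt(2 ln t / n_i), where t is the total number of
    pulls and n_i is the number of pulls of arm i."""
    if np.any(counts == 0):
        return int(np.argmin(counts))  # pull each arm once first
    t = counts.sum()
    bonus = np.sqrt(2.0 * np.log(t) / counts)
    return int(np.argmax(reward_sums / counts + bonus))
```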
Final results: