# Multi-Armed Bandits

Implementations of the UCB1, Bayesian UCB, Epsilon Greedy, and EXP3 bandit algorithms on the MovieLens 20M dataset. Algorithms are evaluated offline using the replay method.
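As a rough illustration of the replay idea (a sketch, not the repo's exact code): the policy is run over the logged events, and only the events where the policy's chosen arm matches the logged arm contribute to the reward estimate. The `EpsilonGreedy` class and `replay_evaluate` helper below are illustrative names, not part of this repository's API.

```python
import random

class EpsilonGreedy:
    """Illustrative epsilon-greedy policy: explore with probability
    epsilon, otherwise pick the arm with the highest running mean."""

    def __init__(self, n_arms, epsilon=0.1, seed=0):
        self.epsilon = epsilon
        self.counts = [0] * n_arms
        self.values = [0.0] * n_arms
        self.rng = random.Random(seed)

    def select_arm(self):
        if self.rng.random() < self.epsilon:
            return self.rng.randrange(len(self.counts))
        return max(range(len(self.counts)), key=lambda a: self.values[a])

    def update(self, arm, reward):
        # Incremental mean update for the chosen arm.
        self.counts[arm] += 1
        self.values[arm] += (reward - self.values[arm]) / self.counts[arm]

def replay_evaluate(policy, logged_events):
    """Offline replay: iterate over logged (arm, reward) events and
    score only those where the policy's choice matches the log."""
    rewards = []
    for logged_arm, reward in logged_events:
        if policy.select_arm() == logged_arm:
            policy.update(logged_arm, reward)
            rewards.append(reward)
    return sum(rewards) / len(rewards) if rewards else 0.0
```

Replay discards non-matching events, so the estimate is unbiased for the policy's online reward but uses only a fraction of the logged data.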

To reproduce:

```bash
git clone https://github.com/jldbc/bandits
cd bandits/bandits
bash run.sh
```

[Experiment setup details](https://jamesrledoux.com/algorithms/offline-bandit-evaluation/)

[Implementation details and results](https://jamesrledoux.com/algorithms/bandit-algorithms-epsilon-ucb-exp-python/)
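For reference, the UCB1 selection rule can be sketched as follows (an illustrative sketch, not this repo's exact implementation): each arm is scored by its empirical mean plus an exploration bonus that shrinks as the arm is pulled more often.

```python
import math

def ucb1_select(counts, values, t):
    """UCB1: pick the arm maximizing mean + sqrt(2 ln t / n_a).
    Untried arms (n_a == 0) are selected first."""
    for arm, n in enumerate(counts):
        if n == 0:
            return arm
    scores = [values[a] + math.sqrt(2 * math.log(t) / counts[a])
              for a in range(len(counts))]
    return scores.index(max(scores))
```

The bonus term guarantees every arm keeps being explored occasionally, while arms with strong empirical means dominate as `t` grows.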

Final results: