Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/bgalbraith/bandits
Python library for Multi-Armed Bandits
- Host: GitHub
- URL: https://github.com/bgalbraith/bandits
- Owner: bgalbraith
- License: apache-2.0
- Created: 2016-04-28T00:26:08.000Z (over 8 years ago)
- Default Branch: master
- Last Pushed: 2020-02-11T13:24:51.000Z (over 4 years ago)
- Last Synced: 2024-08-02T04:02:28.553Z (3 months ago)
- Language: Jupyter Notebook
- Size: 694 KB
- Stars: 733
- Watchers: 35
- Forks: 156
- Open Issues: 1
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
- awesome-computational-neuroscience - Bandits - Python library for Multi-Armed Bandits implements the following algorithms: Epsilon-Greedy, UCB1, Softmax, Thompson Sampling (Package / Computational Cognitive Science)
- awesome-machine-learning-resources - Library
README
# Bandits
Python library for Multi-Armed Bandits

Implements the following algorithms:
* Epsilon-Greedy
* UCB1
* Softmax
* Thompson Sampling (Bayesian)
* Bernoulli, Binomial <=> Beta Distributions (see the sketch below)
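
The last list item refers to the conjugate-prior relationship that Thompson sampling relies on: a Beta prior over an arm's success probability remains a Beta after observing Bernoulli (or Binomial) rewards. Below is a minimal, self-contained sketch of Beta-Bernoulli Thompson sampling; it does not use this library's API, and all names are local to the example.

```
import numpy as np

rng = np.random.default_rng(0)
true_probs = np.array([0.2, 0.5, 0.7])  # hidden Bernoulli reward probability of each arm
alpha = np.ones(len(true_probs))        # Beta posterior parameters, starting from Beta(1, 1)
beta = np.ones(len(true_probs))

for _ in range(1000):
    # Draw one plausible success probability per arm from its Beta posterior
    samples = rng.beta(alpha, beta)
    arm = int(np.argmax(samples))
    reward = rng.random() < true_probs[arm]  # Bernoulli reward
    # Conjugate update: Beta(a, b) -> Beta(a + r, b + 1 - r)
    alpha[arm] += reward
    beta[arm] += 1 - reward

print("posterior mean estimates:", alpha / (alpha + beta))
```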
## Installation

You can install `bandits` with:

```
git clone https://github.com/bgalbraith/bandits.git
cd bandits
pip install .
```
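
If the install succeeded, the package should import from outside the cloned source tree. This assumes the installed package is named `bandits`, as the install command above suggests:

```
# Run this from outside the cloned directory so the import exercises the
# installed copy rather than the local source tree.
import bandits
print(bandits.__file__)
```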
# Examples

* [Bayesian Belief](https://github.com/bgalbraith/bandits/tree/master/notebooks/Stochastic%20Bandits%20-%20Bayesian%20Belief.ipynb)
* [Value Estimation Methods](https://github.com/bgalbraith/bandits/tree/master/notebooks/Stochastic%20Bandits%20-%20Value%20Estimation.ipynb)
* [Preference Estimation](https://github.com/bgalbraith/bandits/tree/master/notebooks/Stochastic%20Bandits%20-%20Preference%20Estimation.ipynb)
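
As a rough indication of what the value-estimation notebook covers, here is a self-contained epsilon-greedy run on a 10-armed Gaussian bandit using incremental sample-average estimates. Like the earlier sketch, it does not use this library's API; the notebooks above are the authoritative examples.

```
import numpy as np

rng = np.random.default_rng(1)
k = 10
true_values = rng.normal(0.0, 1.0, k)  # hidden mean reward of each arm
estimates = np.zeros(k)                # sample-average action-value estimates Q(a)
counts = np.zeros(k)
epsilon = 0.1

for _ in range(2000):
    if rng.random() < epsilon:
        arm = int(rng.integers(k))       # explore a random arm
    else:
        arm = int(np.argmax(estimates))  # exploit the current estimates
    reward = rng.normal(true_values[arm], 1.0)
    counts[arm] += 1
    # Incremental sample-average update: Q <- Q + (r - Q) / n
    estimates[arm] += (reward - estimates[arm]) / counts[arm]

print("best arm (true):", int(np.argmax(true_values)),
      "| best arm (estimated):", int(np.argmax(estimates)))
```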
# References

### Wikipedia
* [Multi-Armed Bandit](https://en.wikipedia.org/wiki/Multi-armed_bandit)
* [Conjugate Prior](https://en.wikipedia.org/wiki/Conjugate_prior)

### Blog Posts
* [When to Run Bandit Tests Instead of A/B/n Tests](https://conversionxl.com/bandit-tests/)
* [Bandit theory, part I](https://blogs.princeton.edu/imabandit/2016/05/11/bandit-theory-part-i/)
* [Bandit theory, part II](https://blogs.princeton.edu/imabandit/2016/05/13/bandit-theory-part-ii/)
* [Bandits for Recommendation Systems](http://engineering.richrelevance.com/bandits-recommendation-systems/)
* [Recommendations with Thompson Sampling](http://engineering.richrelevance.com/recommendations-thompson-sampling/)
* [Personalization with Contextual Bandits](http://engineering.richrelevance.com/personalization-contextual-bandits/)
* [Bayesian Bandits - optimizing click throughs with statistics](https://www.chrisstucchio.com/blog/2013/bayesian_bandit.html)
* [Multi-Armed Bandits](https://dataorigami.net/blogs/napkin-folding/79031811-multi-armed-bandits)
* [Bayesian Bandits](http://tdunning.blogspot.de/2012/02/bayesian-bandits.html)
* [Python Multi-armed Bandits (and Beer!)](http://blog.yhat.com/posts/the-beer-bandit.html)

### Presentations
* [Boston Bayesians Meetup 2016 - Bayesian Bandits From Scratch](https://sites.google.com/site/simplebayes/home/boston-bayesians)
* [ODSC East 2016 - Bayesian Bandits](https://goo.gl/TJt8sG)
* [NYC ML Meetup 2010 - Learning for Contextual Bandits](http://hunch.net/~exploration_learning/main.pdf)

### Books and Book Chapters
* [Reinforcement Learning: An Introduction](https://webdocs.cs.ualberta.ca/~sutton/book/the-book.html)
* [Multi-armed Bandit Allocation Indices](http://www.wiley.com/WileyCDA/WileyTitle/productCd-0470670029.html)
* [Bandit Algorithms for Website Optimization](http://shop.oreilly.com/product/0636920027393.do)
* [Multi-Armed Bandit Problems (in Foundations and Applications of Sensor Management)](http://web.eecs.umich.edu/~teneket/pubs/MAB-Survey.pdf)

### Academic Articles
* [A Survey on Contextual Multi-armed Bandits](http://arxiv.org/abs/1508.03326)

### Software / Tools
* [Bayesian Bandit Explorer (simulation)](https://learnforeverlearn.com/bandits/)
* [Yelp MOE](http://yelp.github.io/MOE/bandit.html)
* [Bandit Algorithms for Website Optimization (code)](https://github.com/johnmyleswhite/BanditsBook)