https://github.com/ravi72munde/chrome-dino-reinforcement-learning
An RL implementation in Keras
- Host: GitHub
- URL: https://github.com/ravi72munde/chrome-dino-reinforcement-learning
- Owner: ravi72munde
- License: mit
- Created: 2018-02-13T03:56:40.000Z (over 7 years ago)
- Default Branch: master
- Last Pushed: 2018-07-04T17:26:32.000Z (almost 7 years ago)
- Last Synced: 2023-10-19T20:17:25.745Z (over 1 year ago)
- Topics: chrome-dino-game, convolutional-neural-networks, q-learning, reinforcement-learning
- Language: Jupyter Notebook
- Homepage: https://medium.com/acing-ai/how-i-build-an-ai-to-play-dino-run-e37f37bdf153
- Size: 138 MB
- Stars: 98
- Watchers: 8
- Forks: 19
- Open Issues: 0
- Metadata Files:
  - Readme: README.md
  - License: LICENSE
README
# Chrome-Dino-Reinforcement-Learning
A Deep Convolutional Neural Network that learns to play Google Chrome's offline Dino Run game from visual input, using a model-free Reinforcement Learning algorithm.

NOTE: This repo is a basic implementation with a few limitations. Please refer to the newer repo at https://github.com/Paperspace/DinoRunTutorial, where I've used a GPU VM for better results (scores up to 4000), and the accompanying write-up at https://blog.paperspace.com/dino-run/
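For orientation, the core idea is a DQN-style setup: preprocessed game frames go into a small CNN that outputs one Q-value per action, and the network is trained against the standard Q-learning target. The sketch below is illustrative only, not the notebook's exact architecture or hyperparameters; the frame size, layer shapes, discount factor, and the two-action assumption (do nothing / jump) are all assumptions.

```python
# Illustrative DQN-style Q-network and update step in Keras.
# Frame size, layer shapes, and hyperparameters are assumptions,
# not the notebook's exact values.
import numpy as np
from keras.models import Sequential
from keras.layers import Conv2D, Flatten, Dense
from keras.optimizers import Adam

IMG_ROWS, IMG_COLS, STACKED_FRAMES = 80, 80, 4  # stack of recent grayscale frames
NUM_ACTIONS = 2                                 # 0 = do nothing, 1 = jump
GAMMA = 0.99                                    # discount factor

def build_q_network():
    """CNN that maps a stack of game frames to one Q-value per action."""
    model = Sequential()
    model.add(Conv2D(32, (8, 8), strides=4, activation='relu',
                     input_shape=(IMG_ROWS, IMG_COLS, STACKED_FRAMES)))
    model.add(Conv2D(64, (4, 4), strides=2, activation='relu'))
    model.add(Conv2D(64, (3, 3), strides=1, activation='relu'))
    model.add(Flatten())
    model.add(Dense(512, activation='relu'))
    model.add(Dense(NUM_ACTIONS))               # linear outputs = Q-values
    model.compile(loss='mse', optimizer=Adam(1e-4))
    return model

def q_learning_update(model, states, actions, rewards, next_states, terminals):
    """One Q-learning step on a batch of (s, a, r, s', done) transitions."""
    q_current = model.predict(states)           # Q(s, .) for each sample
    q_next = model.predict(next_states)         # Q(s', .) for the bootstrap target
    targets = q_current.copy()
    for i in range(len(states)):
        if terminals[i]:
            targets[i, actions[i]] = rewards[i]
        else:
            targets[i, actions[i]] = rewards[i] + GAMMA * np.max(q_next[i])
    return model.train_on_batch(states, targets)
```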
Refer to the Jupyter notebook for the detailed implementation:
https://github.com/ravi72munde/Chrome-Dino-Reinforcement-Learning/blob/master/Reinforcement%20Learning%20Dino%20Run.ipynb

# Installation
Start by cloning the repository:

`$ git clone https://github.com/ravi72munde/Chrome-Dino-Reinforcement-Learning.git`

Dependencies can be installed using `pip install`, or `conda install` in an Anaconda environment (see the example command after the list below).

# Dependencies
- Python 3.6
- Selenium
- OpenCV
- PIL
- Keras
- Chromium driver for Selenium
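For example, with pip (exact package names may vary with your environment):

`$ pip install selenium opencv-python pillow keras`

The pieces fit together roughly as follows: Selenium drives Chrome through chromedriver, the game's own `Runner.instance_` JavaScript object is queried for crash/restart, and PIL + OpenCV turn screenshots into the small grayscale frames the network consumes. The class below is a sketch of that wiring, not the repo's actual interface; the `chrome://dino` URL, method names, and frame size are assumptions.

```python
# Sketch of a Selenium-based game interface (names and details are
# illustrative, not the repo's actual class).
import io
import numpy as np
import cv2
from PIL import Image
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.common.keys import Keys
from selenium.common.exceptions import WebDriverException

class DinoGame:
    def __init__(self):
        # Requires chromedriver to be discoverable by Selenium (e.g. on PATH).
        self.driver = webdriver.Chrome()
        try:
            # Some Chrome builds raise here even though the dino page loads;
            # the notebook may instead load a local copy of the game.
            self.driver.get("chrome://dino")
        except WebDriverException:
            pass
        self.body = self.driver.find_element(By.TAG_NAME, "body")

    def jump(self):
        self.body.send_keys(Keys.SPACE)   # space starts the game and jumps

    def is_crashed(self):
        # Runner.instance_ is the game's own JS object on the dino page
        return self.driver.execute_script("return Runner.instance_.crashed")

    def restart(self):
        self.driver.execute_script("Runner.instance_.restart()")

    def get_frame(self, size=(80, 80)):
        """Screenshot -> small grayscale array suitable for the Q-network."""
        png = self.driver.get_screenshot_as_png()
        img = Image.open(io.BytesIO(png)).convert("L")   # decode + grayscale (PIL)
        return cv2.resize(np.array(img), size)           # downscale (OpenCV)
```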
# Sample Game Play
https://youtu.be/0oOOqGFmlDs