https://github.com/n3011/rexai
Play the Google Chrome dino T-rex game using tefla AI
- Host: GitHub
- URL: https://github.com/n3011/rexai
- Owner: n3011
- Created: 2019-02-04T18:58:41.000Z (over 6 years ago)
- Default Branch: master
- Last Pushed: 2019-02-04T19:00:22.000Z (over 6 years ago)
- Last Synced: 2025-01-29T18:13:22.131Z (8 months ago)
- Topics: artificial-intelligence, reinforcement-learning, tefla, tensorflow, trex-game
- Language: JavaScript
- Homepage:
- Size: 1.93 MB
- Stars: 0
- Watchers: 3
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# rexAI
Play Google's T-rex game using Deep Q-Learning.

## Prerequisites
- tensorflow >= 1.9.0
- tefla

## Getting started
### Installation
```Shell
git clone https://github.com/n3011/rexai.git
cd rexai
pip install -r requirements.txt
export PYTHONPATH=.
```
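To confirm the install worked, a quick sanity check (a minimal sketch, assuming the `pip install` above completed) is to import the prerequisites and verify the TensorFlow version:

```Python
# Sanity check for the prerequisites listed above (not part of the repository).
import tensorflow as tf
import tefla  # should import cleanly if requirements.txt installed without errors

print(tf.__version__)  # the README requires >= 1.9.0
```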
### Webserver for running the JavaScript T-rex game

A simple web server is required to serve the JavaScript T-rex game.
```Shell
$ cd rextf/game
$ python2 -m SimpleHTTPServer
```
The game is now accessible on your localhost at `127.0.0.1:port` (SimpleHTTPServer serves on port 8000 by default).
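If only Python 3 is available (where `SimpleHTTPServer` was renamed to `http.server`), you can serve the game with `python3 -m http.server` from `rextf/game`, or programmatically with the short sketch below; the port number 8000 is just an example:

```Python
# Serve rextf/game over HTTP, equivalent to `python2 -m SimpleHTTPServer`.
import http.server
import os
import socketserver

os.chdir("rextf/game")  # serve the game directory
with socketserver.TCPServer(("", 8000), http.server.SimpleHTTPRequestHandler) as httpd:
    httpd.serve_forever()
```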
### Run training

```Shell
python main_loop.py --logdir /path/to/logdir
```
Open your browser and press `F5` to start the game.
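For context on what the training loop optimizes: Deep Q-Learning regresses the network's Q-values toward the Bellman target `r + gamma * max_a' Q(s', a')`. The snippet below is only an illustrative sketch of that target computation on a batch of replay transitions using NumPy; it is not code from this repository, and names such as `q_next`, `done`, and `gamma` are hypothetical.

```Python
import numpy as np

def q_learning_targets(rewards, q_next, done, gamma=0.99):
    """Bellman targets for a batch of transitions.

    rewards: (batch,) immediate rewards, e.g. a small bonus per survived frame
    q_next:  (batch, n_actions) Q-values of the next states (target network)
    done:    (batch,) 1.0 where the episode ended (the dino crashed), else 0.0
    """
    # (1 - done) zeroes the bootstrapped term for terminal transitions.
    return rewards + gamma * (1.0 - done) * q_next.max(axis=1)

# Toy batch of two transitions with two actions (jump / do nothing).
targets = q_learning_targets(
    rewards=np.array([0.1, -1.0]),
    q_next=np.array([[0.5, 0.2], [0.0, 0.0]]),
    done=np.array([0.0, 1.0]),
)
print(targets)  # [ 0.595 -1.   ]
```

The `(1 - done)` factor stops bootstrapping past a crash, which is how a simple scalar reward signal can teach the network when to jump.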
## References

The game environment and some parts of the code are inspired by https://github.com/vdutor/TF-rex.