https://github.com/mrforexample/dppo-tf2
Simple DPPO & PPO implementation using TensorFlow v2
- Host: GitHub
- URL: https://github.com/mrforexample/dppo-tf2
- Owner: MrForExample
- Created: 2020-05-18T06:07:35.000Z (over 5 years ago)
- Default Branch: master
- Last Pushed: 2020-05-18T06:24:57.000Z (over 5 years ago)
- Last Synced: 2025-01-17T06:09:46.306Z (10 months ago)
- Language: Python
- Homepage:
- Size: 6.84 KB
- Stars: 1
- Watchers: 3
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# DPPO-tf2
## Simple implementation of DPPO & PPO based on TensorFlow v2
- Algorithms based on the papers [DeepMind DPPO](https://arxiv.org/abs/1707.02286) & [OpenAI PPO](https://arxiv.org/abs/1707.06347)
- Environments use OpenAI Gym
- Plotting uses TensorBoard
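The core of the OpenAI PPO paper cited above is the clipped surrogate objective, which limits how far a policy update can move from the old policy. A minimal sketch in plain Python (function and parameter names here are illustrative, not taken from this repository):

```python
def ppo_clip_objective(ratio, advantage, eps=0.2):
    """Per-sample PPO clipped surrogate objective.

    ratio:     pi_new(a|s) / pi_old(a|s), the probability ratio
    advantage: estimated advantage A(s, a) for that sample
    eps:       clip range epsilon (0.2 is the paper's default)

    PPO maximizes min(ratio * A, clip(ratio, 1-eps, 1+eps) * A),
    a pessimistic bound that removes the incentive to push the
    ratio outside [1-eps, 1+eps].
    """
    clipped_ratio = max(1.0 - eps, min(1.0 + eps, ratio))
    return min(ratio * advantage, clipped_ratio * advantage)


# Positive advantage: the gain is capped once ratio exceeds 1 + eps.
print(ppo_clip_objective(1.5, 1.0))   # capped at 1.2, not 1.5

# Negative advantage: the penalty is not capped below 1 - eps,
# so large policy moves in the wrong direction are still punished.
print(ppo_clip_objective(0.5, -1.0))  # -0.8, the pessimistic branch
```

In a TF2 implementation the same expression would typically be computed with `tf.minimum` / `tf.clip_by_value` over a batch inside a `tf.GradientTape`, and the mean negated to form a loss.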
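DPPO extends PPO by having several workers collect trajectories in parallel, each against its own copy of the environment, while a chief merges the data and performs the PPO update. The coordination pattern can be sketched with the standard library alone (the worker body is a stub; names and structure are illustrative, not this repository's code):

```python
import queue
import threading

def worker(worker_id, rollouts, n_steps):
    """Hypothetical DPPO worker: each worker would step its own Gym
    environment copy with the current policy; here the environment
    interaction is stubbed out with placeholder transitions."""
    trajectory = [(worker_id, step) for step in range(n_steps)]
    rollouts.put(trajectory)  # hand the finished rollout to the chief

N_WORKERS, N_STEPS = 3, 4
rollouts = queue.Queue()
threads = [
    threading.Thread(target=worker, args=(i, rollouts, N_STEPS))
    for i in range(N_WORKERS)
]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Chief: merge all rollouts into one batch, then it would compute
# advantages and run several epochs of the clipped PPO update.
batch = [rollouts.get() for _ in range(N_WORKERS)]
```

A real implementation would also broadcast the updated policy weights back to the workers before the next collection round; DeepMind's paper additionally describes a variant that averages gradients across workers rather than pooling data.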