https://github.com/geyang/gym-distracting-control
This is a packaged version of the distracting control suite from Stone et al.
- Host: GitHub
- URL: https://github.com/geyang/gym-distracting-control
- Owner: geyang
- Created: 2021-05-06T19:45:23.000Z (over 3 years ago)
- Default Branch: main
- Last Pushed: 2022-06-20T09:01:59.000Z (over 2 years ago)
- Last Synced: 2024-04-14T15:11:50.955Z (9 months ago)
- Language: Python
- Size: 1.68 MB
- Stars: 5
- Watchers: 2
- Forks: 4
- Open Issues: 2
Metadata Files:
- Readme: README
README
The Distracting Control Suite
=============================

This is a packaged version of the ``distracting_control`` suite from
*Stone et al.* We provide OpenAI gym bindings to make the original code
base easier to use.

Getting Started
---------------

.. code-block:: bash

   pip install distracting_control

Then in your Python script:

.. code-block:: python

   import gym

   env = gym.make('gdc:Hopper-hop-easy-v1', from_pixels=True)
   obs = env.reset()
   # `doc.figure` is the documentation helper used to render the figure in
   # this README; it is not part of the environment API.
   doc.figure(obs, "figures/hopper_readme.png?raw=true")
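
The ``doc.figure`` call above belongs to the tooling that renders this README and is not needed to use the environment. If you just want to save the pixel observation locally, here is a minimal sketch, assuming the default channel-first layout and that ``imageio`` is installed (both are assumptions, not part of this package's documented API):

.. code-block:: python

   import gym
   import imageio
   import numpy as np

   env = gym.make('gdc:Hopper-hop-easy-v1', from_pixels=True)
   obs = env.reset()

   # Assumption: observations come back channel-first as (C, H, W).
   # Transpose to (H, W, C) so the frame can be written as a PNG.
   imageio.imwrite("hopper_readme.png", np.transpose(obs, (1, 2, 0)))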

Detailed API
------------

Take a look at the test file in the `specs <https://github.com/geyang/distracting_control/blob/master/specs>`__ folder, and
the source code. DeepMind control has a lot of low-level bindings buried
in the source code.

.. code-block:: python

   import gym

   def test_max_episode_steps():
       env = gym.make('distracting_control:Walker-walk-easy-v1')
       assert env._max_episode_steps == 250

   def test_flat_obs():
       env = gym.make('distracting_control:Walker-walk-easy-v1', frame_skip=4)
       # unwrap through the gym wrappers to reach the dm_control observation spec
       env.env.env.env.observation_spec()
       assert env.reset().shape == (24,)

   def test_frame_skip():
       env = gym.make('distracting_control:Walker-walk-easy-v1', from_pixels=True, frame_skip=8)
       assert env._max_episode_steps == 125

   def test_channel_first():
       env = gym.make('distracting_control:Walker-walk-easy-v1', from_pixels=True, channels_first=True)
       assert env.reset().shape == (3, 84, 84)

   def test_channel_last():
       env = gym.make('distracting_control:Walker-walk-easy-v1', from_pixels=True, frame_skip=8, channels_first=False)
       assert env._max_episode_steps == 125
       assert env.reset().shape == (84, 84, 3)
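
Note how ``test_flat_obs`` reaches the underlying ``dm_control``-backed environment by chaining ``.env`` accesses. A shorter sketch of the same idea uses gym's ``unwrapped`` property, assuming the innermost environment still exposes ``observation_spec()`` (an assumption about this package's wrapper stack, not a documented guarantee):

.. code-block:: python

   import gym

   env = gym.make('distracting_control:Walker-walk-easy-v1', frame_skip=4)

   # gym's `unwrapped` skips the registration-time wrappers and returns the
   # innermost environment object; assumption: that object wraps dm_control
   # directly and exposes its observation_spec().
   spec = env.unwrapped.observation_spec()
   print(spec)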

Important Changes from *Stone et al*
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

1. [planned] remove tensorflow dependency
2. [planned] increase ground floor transparency in ``Hopper``

Original README
---------------

``distracting_control`` extends ``dm_control`` with static or dynamic
visual distractions in the form of changing colors, backgrounds, and
camera poses. Details and experimental results can be found in our
`paper <https://arxiv.org/abs/2101.02722>`__.

Requirements and Installation
------------------------------

- Clone this repository
- ``sh run.sh``
- Follow the instructions and install
  `dm_control <https://github.com/deepmind/dm_control>`__.
  Make sure you set up your MuJoCo keys correctly.
- Download the `DAVIS 2017
  dataset <https://davischallenge.org/davis2017/code.html>`__. Make
  sure to select the 2017 TrainVal - Images and Annotations (480p). The
  training images will be used as distracting backgrounds.

Instructions
------------

- You can run the ``distracting_control_demo`` to generate sample
  images of the different tasks at different difficulties::

     python distracting_control_demo --davis_path=$HOME/DAVIS/JPEGImages/480p/ \
       --output_dir=/tmp/distracting_control_demo

- As seen from the demo, to generate an instance of the environment you
  simply need to import the suite and use ``suite.load`` while
  specifying the ``dm_control`` domain and task, then choosing a
  difficulty and providing the dataset_path (see the sketch after this
  list).
- Note that the environment follows the dm_control environment APIs.
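
A minimal sketch of that flow, assuming the packaged module exposes the original ``suite.load`` entry point with ``difficulty`` and ``background_dataset_path`` keywords (check the package source for the exact names):

.. code-block:: python

   import os
   from distracting_control import suite

   # Assumption: keyword names follow Stone et al.'s original suite.load;
   # verify against the packaged suite module before relying on them.
   env = suite.load(
       "walker",
       "walk",
       difficulty="easy",
       background_dataset_path=os.path.expanduser("~/DAVIS/JPEGImages/480p"),
   )

   # The returned environment follows the dm_control API, not gym's:
   # reset() and step() return dm_env TimeStep objects.
   timestep = env.reset()
   timestep = env.step(env.action_spec().generate_value())
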

Paper
-----

If you use this code, please cite the accompanying
`paper <https://arxiv.org/abs/2101.02722>`__ as::

   @article{stone2021distracting,
     title={The Distracting Control Suite -- A Challenging Benchmark for Reinforcement Learning from Pixels},
     author={Austin Stone and Oscar Ramirez and Kurt Konolige and Rico Jonschkowski},
     year={2021},
     journal={arXiv preprint arXiv:2101.02722},
   }

Disclaimer
----------

This is not an official Google product.