https://github.com/5had3z/chasing-targets-gym
- Host: GitHub
- URL: https://github.com/5had3z/chasing-targets-gym
- Owner: 5had3z
- License: mit
- Created: 2023-03-09T04:46:15.000Z (over 2 years ago)
- Default Branch: main
- Last Pushed: 2024-12-21T01:07:10.000Z (6 months ago)
- Last Synced: 2025-04-03T01:04:21.916Z (3 months ago)
- Language: Python
- Size: 2.7 MB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.rst
- License: LICENSE
README
===================
Chasing Targets Gym
===================

|version| |python| |license| |codestyle|
.. |version| image:: https://img.shields.io/pypi/v/chasing-targets-gym
   :target: https://pypi.org/project/chasing-targets-gym/
   :alt: PyPI - Package Version

.. |python| image:: https://img.shields.io/pypi/pyversions/chasing-targets-gym
   :target: https://pypi.org/project/chasing-targets-gym/
   :alt: PyPI - Python Version

.. |license| image:: https://img.shields.io/pypi/l/chasing-targets-gym
   :target: https://github.com/5had3z/chasing-targets-gym/blob/main/LICENSE
   :alt: PyPI - License

.. |codestyle| image:: https://img.shields.io/badge/code%20style-black-000000.svg
   :target: https://github.com/psf/black

Introduction
------------

This is a simple gym environment that sets up a set of robots and targets for them to chase.
The targets are dumb: they simply move at a constant speed and bounce off the "limits" of
the simulation environment. The intention is that the robots chase after these targets,
switching to a new target after catching their current one. The targets are "transparent",
so robots are free to drive through them without avoidance; the robots are, however, expected
to avoid each other. An example of a simulation with a robot controller is shown below.

.. image:: misc/example_sim.gif
Usage
-----

Since this uses Gymnasium, you can spin up an environment the same as any other env, and you can use the included optimized planner. Installing this library also provides a script that shows off this planner and environment: ``chasing-targets-example --max-step=500``.
.. code:: python

   from gymnasium import Env, make

   from chasing_targets_gym.planner import Planner

   env: Env = make(
       "ChasingTargets-v0",
       render_mode="human",
       n_robots=10,
       n_targets=3,
       robot_radius=0.1,
       max_velocity=0.5,
       target_velocity_std=0.5,
       max_episode_steps=1000,
   )

   planner = Planner(
       env.get_wrapper_attr("robot_radius"),
       env.get_wrapper_attr("dt"),
       env.get_wrapper_attr("max_velocity"),
   )
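The snippet above only constructs the environment and planner. The loop below is a minimal sketch of stepping them together using the standard Gymnasium ``reset``/``step`` API; the ``planner(observation)`` call is an assumption for illustration, as the planner's exact call signature is not shown here.

.. code:: python

   # Minimal stepping-loop sketch (assumes the planner maps an observation
   # to an action; the exact call signature may differ).
   observation, info = env.reset(seed=0)
   terminated = truncated = False
   while not (terminated or truncated):
       action = planner(observation)  # hypothetical call, for illustration
       observation, reward, terminated, truncated, info = env.step(action)
   env.close()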
Installation
------------

Either clone and pip install the source, or install the package from PyPI.
If installing from source seems to stall for no apparent reason, try adding ``--no-build-isolation`` to the pip command.

.. code:: bash

   git clone https://github.com/5had3z/chasing-targets-gym && cd chasing-targets-gym && pip3 install -e .
Otherwise, install the PyPI package:

.. code:: bash

   pip3 install chasing-targets-gym
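As a quick post-install sanity check, the snippet below mirrors the imports and ``make`` call from the usage example above (with ``render_mode`` omitted), resets the environment once, and closes it. It is only a sketch: importing from the package is assumed to register the ``ChasingTargets-v0`` id, as the usage example implies.

.. code:: python

   # Post-install sanity check: mirrors the usage example above.
   from gymnasium import make

   from chasing_targets_gym.planner import Planner  # noqa: F401  (assumed to register the env id)

   env = make(
       "ChasingTargets-v0",
       n_robots=10,
       n_targets=3,
       robot_radius=0.1,
       max_velocity=0.5,
       target_velocity_std=0.5,
       max_episode_steps=1000,
   )
   observation, info = env.reset(seed=0)
   env.close()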
Some Credit
-----------

I was pointed to a basic environment `here `_, but it didn't really match what I wanted, so I made my own based on it.