https://github.com/openai/multiagent-competition
Code for the paper "Emergent Complexity via Multi-agent Competition"
- Host: GitHub
- URL: https://github.com/openai/multiagent-competition
- Owner: openai
- Created: 2017-10-11T15:29:00.000Z (over 7 years ago)
- Default Branch: master
- Last Pushed: 2023-04-02T11:57:48.000Z (about 2 years ago)
- Last Synced: 2025-04-13T07:49:15.071Z (about 2 months ago)
- Topics: paper
- Language: Python
- Homepage: https://arxiv.org/abs/1710.03748
- Size: 16.6 MB
- Stars: 812
- Watchers: 47
- Forks: 153
- Open Issues: 12
Metadata Files:
- Readme: README.md
**Status:** Archive (code is provided as-is, no updates expected)
# Competitive Multi-Agent Environments
This repository contains the environments for the paper [Emergent Complexity via Multi-agent Competition](https://arxiv.org/abs/1710.03748).
## Dependencies
Use `pip install -r requirements.txt` to install dependencies. If you haven't used MuJoCo before, please refer to the [installation guide](https://github.com/openai/mujoco-py).
The code has been tested with the following dependencies:
* Python version 3.6
* [OpenAI GYM](https://github.com/openai/gym) version 0.9.1 with MuJoCo 1.31 support (use [mujoco-py version 0.5.7](https://github.com/openai/mujoco-py/tree/0.5))
* [Tensorflow](https://www.tensorflow.org/versions/r1.1/install/) version 1.1.0
* [NumPy](https://scipy.org/install.html) version 1.12.1

## Installing Package
After installing all dependencies, make sure Gym works with support for MuJoCo environments.
Next, install the `gym-compete` package:
```bash
cd gym-compete
pip install -e .
```
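One way to verify the editable install from outside the directory is to query Python's import machinery. This sketch uses only the standard library, so it is safe to run whether or not the install succeeded:

```python
# Sanity check: report whether the gym_compete package is importable.
# Uses only the standard library, so it runs even before installation.
import importlib.util

spec = importlib.util.find_spec("gym_compete")
if spec is None:
    print("gym_compete is NOT installed -- rerun `pip install -e .`")
else:
    print("gym_compete found at:", spec.origin)
```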
Check that the install succeeded by leaving the directory and running `import gym_compete` in a Python console. Some users might require a `sudo pip install`.

## Trying the environments
Agent policies for the various environments are provided in the `agent-zoo` folder. To see a demo of all the environments, run:
```bash
bash demo_tasks.sh all
```
To instead try a single environment, use:
```bash
bash demo_tasks.sh <task>
```
where `<task>` is one of: `run-to-goal-humans`, `run-to-goal-ants`, `you-shall-not-pass`, `sumo-ants`, `sumo-humans`, and `kick-and-defend`.
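Beyond the demo script, the environments can also be created directly through Gym once `gym_compete` has been imported. A minimal sketch, assuming the registered environment IDs follow the task names above with a `-v0` suffix (this suffix is an assumption; check the package's registration code for the authoritative IDs):

```python
import gym
import gym_compete  # noqa: F401 -- importing registers the competitive envs

# "run-to-goal-ants-v0" is an assumed ID based on the task names above.
env = gym.make("run-to-goal-ants-v0")
observations = env.reset()  # multi-agent env: one observation per agent
print(env.action_space)
env.close()
```

Note that this requires the full dependency stack (Gym 0.9.1, mujoco-py 0.5.7, TensorFlow 1.1.0) listed above.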