https://github.com/tianheyu927/mil
Code for "One-Shot Visual Imitation Learning via Meta-Learning"
- Host: GitHub
- URL: https://github.com/tianheyu927/mil
- Owner: tianheyu927
- License: MIT
- Created: 2017-11-13T04:42:42.000Z (over 7 years ago)
- Default Branch: master
- Last Pushed: 2018-10-08T19:57:02.000Z (over 6 years ago)
- Last Synced: 2025-04-09T15:07:52.310Z (3 months ago)
- Topics: deep-learning, imitation-learning, meta-learning, robotics
- Language: Python
- Homepage: https://sites.google.com/view/one-shot-imitation
- Size: 65.4 KB
- Stars: 287
- Watchers: 11
- Forks: 70
- Open Issues: 6
Metadata Files:
- Readme: README.md
- License: LICENSE
# One-Shot Visual Imitation Learning via Meta-Learning
*A TensorFlow implementation of two papers: [One-Shot Visual Imitation Learning via Meta-Learning (Finn*, Yu* et al., 2017)](https://arxiv.org/abs/1709.04905) and [One-Shot Imitation from Observing Humans via Domain-Adaptive Meta-Learning (Yu*, Finn* et al., 2018)](https://arxiv.org/abs/1802.01557).*

Here are the instructions to run the experiments shown in the papers.

First, clone the fork of the gym repo found [here](https://github.com/tianheyu927/gym), switch to the *mil* branch, and follow the instructions there to install gym.
Then go to the `mil` directory and run `./scripts/get_data.sh` to download the data.
After downloading the data, training and testing scripts for MIL are available in `scripts/`.
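The method these scripts train is built on model-agnostic meta-learning (MAML): the policy is adapted to a new task with one gradient step on a single demonstration, and the meta-parameters are updated so that this adaptation works well. The following is a minimal, self-contained sketch of that inner/outer loop on a synthetic linear-policy problem — it is illustrative only and does not reproduce the repo's vision-based architecture; all shapes, step sizes, and data here are made up, and the outer update uses the first-order MAML approximation for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

def bc_loss_grad(w, obs, acts):
    """Behavior-cloning (mean squared error) loss and its gradient
    for a linear policy: pred = obs @ w."""
    err = obs @ w - acts
    loss = np.mean(err ** 2)
    grad = 2.0 * obs.T @ err / err.size
    return loss, grad

obs_dim, act_dim = 4, 2
w = rng.normal(size=(obs_dim, act_dim)) * 0.1
alpha, beta = 0.1, 0.01  # inner and outer step sizes (illustrative)

for step in range(200):
    # Each "task" is a random synthetic expert policy.
    w_star = rng.normal(size=(obs_dim, act_dim))
    obs_demo = rng.normal(size=(16, obs_dim))
    acts_demo = obs_demo @ w_star        # the single demonstration
    obs_val = rng.normal(size=(16, obs_dim))
    acts_val = obs_val @ w_star          # held-out data for the meta-update

    # Inner loop: one gradient step on the demonstration.
    _, g = bc_loss_grad(w, obs_demo, acts_demo)
    w_adapted = w - alpha * g

    # Outer loop: update the meta-parameters using the adapted policy's
    # loss on held-out data (first-order approximation of MAML).
    _, g_val = bc_loss_grad(w_adapted, obs_val, acts_val)
    w = w - beta * g_val
```

After meta-training, a single inner-loop step on one demonstration of an unseen task should already reduce the imitation loss, which is the one-shot behavior the paper's experiments measure.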
**UPDATE (7/3/2018)**: To run the experiment with the learned temporal loss, as in the [One-Shot Imitation from Observing Humans via Domain-Adaptive Meta-Learning](https://arxiv.org/abs/1802.01557) paper, see the `scripts/run_sim_push_video_only.sh` script.

*Note: the code includes only the simulated experiments.*