Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/MineDojo/Voyager
An Open-Ended Embodied Agent with Large Language Models
embodied-learning large-language-models minecraft open-ended-learning
Last synced: about 2 months ago
- Host: GitHub
- URL: https://github.com/MineDojo/Voyager
- Owner: MineDojo
- License: mit
- Created: 2023-05-25T18:20:15.000Z (over 1 year ago)
- Default Branch: main
- Last Pushed: 2024-04-03T18:51:36.000Z (9 months ago)
- Last Synced: 2024-10-29T15:33:01.397Z (about 2 months ago)
- Topics: embodied-learning, large-language-models, minecraft, open-ended-learning
- Language: JavaScript
- Homepage: https://voyager.minedojo.org/
- Size: 5.03 MB
- Stars: 5,614
- Watchers: 64
- Forks: 528
- Open Issues: 12
- Metadata Files:
  - Readme: README.md
  - License: LICENSE
Awesome Lists containing this project
- awesome-agents - Voyager - An Open-Ended Embodied Agent with Large Language Models ![GitHub Repo stars](https://img.shields.io/github/stars/MineDojo/Voyager?style=social) (Game / Simulation)
- awesome-langchain-zh - Voyager
- awesome-langchain - Voyager - An Open-Ended Embodied Agent with Large Language Models ![GitHub Repo stars](https://img.shields.io/github/stars/MineDojo/Voyager?style=social) (Tools / Agents)
- StarryDivineSky - MineDojo/Voyager - interacts with GPT-4, which bypasses model parameter fine-tuning. Empirically, Voyager shows strong in-context lifelong learning capability and exhibits exceptional proficiency in playing Minecraft. It obtains 3.3× more unique items, travels 2.3× longer distances, and unlocks key tech tree milestones up to 15.3× faster than the prior SOTA. Voyager is able to utilize the learned skill library in a new Minecraft world to solve novel tasks from scratch, while other techniques struggle to generalize. (A01_Text Generation_Text Dialogue / Large Language Dialogue Models and Data)
- AiTreasureBox - MineDojo/Voyager - ![GitHub stars](https://img.shields.io/github/stars/MineDojo/Voyager.svg) | An Open-Ended Embodied Agent with Large Language Models | (Repos)
README
# Voyager: An Open-Ended Embodied Agent with Large Language Models
[[Website]](https://voyager.minedojo.org/)
[[Arxiv]](https://arxiv.org/abs/2305.16291)
[[PDF]](https://voyager.minedojo.org/assets/documents/voyager.pdf)
[[Tweet]](https://twitter.com/DrJimFan/status/1662115266933972993?s=20)

[![Python Version](https://img.shields.io/badge/Python-3.9-blue.svg)](https://github.com/MineDojo/Voyager)
[![GitHub license](https://img.shields.io/github/license/MineDojo/Voyager)](https://github.com/MineDojo/Voyager/blob/main/LICENSE)
______________________________________________________________________

https://github.com/MineDojo/Voyager/assets/25460983/ce29f45b-43a5-4399-8fd8-5dd105fd64f2
![](images/pull.png)
We introduce Voyager, the first LLM-powered embodied lifelong learning agent
in Minecraft that continuously explores the world, acquires diverse skills, and
makes novel discoveries without human intervention. Voyager consists of three
key components: 1) an automatic curriculum that maximizes exploration, 2) an
ever-growing skill library of executable code for storing and retrieving complex
behaviors, and 3) a new iterative prompting mechanism that incorporates environment
feedback, execution errors, and self-verification for program improvement.
Voyager interacts with GPT-4 via blackbox queries, which bypasses the need for
model parameter fine-tuning. The skills developed by Voyager are temporally
extended, interpretable, and compositional, which compounds the agent’s abilities
rapidly and alleviates catastrophic forgetting. Empirically, Voyager shows
strong in-context lifelong learning capability and exhibits exceptional proficiency
in playing Minecraft. It obtains 3.3× more unique items, travels 2.3× longer
distances, and unlocks key tech tree milestones up to 15.3× faster than prior SOTA.
Voyager is able to utilize the learned skill library in a new Minecraft world to
solve novel tasks from scratch, while other techniques struggle to generalize.

In this repo, we provide Voyager code. This codebase is under [MIT License](LICENSE).
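To make the loop described above concrete, here is a schematic Python sketch of the curriculum / skill library / iterative prompting pipeline. This is not the actual Voyager code or API; every helper name below is a hypothetical placeholder.

```python
# Schematic sketch only: all helpers are hypothetical placeholders, not the Voyager API.

def propose_next_task(completed, failed):
    """Automatic curriculum: propose the next exploration goal (placeholder)."""
    return f"exploration goal #{len(completed) + len(failed) + 1}"

def generate_code(task, skill_library, feedback):
    """Ask the LLM for an executable program, conditioned on retrieved skills (placeholder)."""
    return f"// mineflayer program for: {task}"

def run_in_minecraft(program):
    """Execute the program and return (environment feedback, execution errors) (placeholder)."""
    return "chat log ...", None

def self_verify(task, feedback):
    """Ask the LLM whether the task was actually completed (placeholder)."""
    return True

skill_library, completed, failed = {}, [], []
for _ in range(3):                      # a few illustrative curriculum steps
    task = propose_next_task(completed, failed)
    feedback, errors = "", None
    for _ in range(4):                  # iterative prompting with feedback
        program = generate_code(task, skill_library, (feedback, errors))
        feedback, errors = run_in_minecraft(program)
        if errors is None and self_verify(task, feedback):
            skill_library[task] = program   # store the new skill for later retrieval
            completed.append(task)
            break
    else:
        failed.append(task)
```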
# Installation
Voyager requires Python ≥ 3.9 and Node.js ≥ 16.13.0. We have tested Voyager on Ubuntu 20.04, Windows 11, and macOS. Follow the instructions below to install Voyager.

## Python Install
```
git clone https://github.com/MineDojo/Voyager
cd Voyager
pip install -e .
```

## Node.js Install
In addition to the Python dependencies, you need to install the following Node.js packages:
```
cd voyager/env/mineflayer
npm install -g npx
npm install
cd mineflayer-collectblock
npx tsc
cd ..
npm install
```

## Minecraft Instance Install
Voyager depends on the Minecraft game. You need to install Minecraft and set up a Minecraft instance.
Follow the instructions in [Minecraft Login Tutorial](installation/minecraft_instance_install.md) to set up your Minecraft Instance.
## Fabric Mods Install
You need to install Fabric mods to support all the features in Voyager. Remember to use the correct Fabric version for all the mods.
Follow the instructions in [Fabric Mods Install](installation/fabric_mods_install.md) to install the mods.
# Getting Started
Voyager uses OpenAI's GPT-4 as the language model. You need to have an OpenAI API key to use Voyager. You can get one from [here](https://platform.openai.com/account/api-keys).

After the installation process, you can run Voyager by:
```python
from voyager import Voyager

# You can also use mc_port instead of azure_login (see the sketch after the setup steps below),
# but azure_login is highly recommended
azure_login = {
    "client_id": "YOUR_CLIENT_ID",
    "redirect_url": "https://127.0.0.1/auth-response",
    "secret_value": "[OPTIONAL] YOUR_SECRET_VALUE",
    "version": "fabric-loader-0.14.18-1.19", # the version Voyager is tested on
}
openai_api_key = "YOUR_API_KEY"

voyager = Voyager(
    azure_login=azure_login,
    openai_api_key=openai_api_key,
)

# start lifelong learning
voyager.learn()
```

* If you are running with `Azure Login` for the first time, it will ask you to follow the command line instruction to generate a config file.
* For `Azure Login`, you also need to select the world and open the world to LAN by yourself. After you run `voyager.learn()`, the game will pop up soon; you need to:
  1. Select `Singleplayer` and press `Create New World`.
  2. Set Game Mode to `Creative` and Difficulty to `Peaceful`.
  3. After the world is created, press the `Esc` key and press `Open to LAN`.
  4. Select `Allow cheats: ON` and press `Start LAN World`. You will see the bot join the world soon.
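As the code comment above notes, `mc_port` can be used instead of `azure_login`. A minimal sketch, assuming the world is already open to LAN; the port number here is made up and should be replaced with the one printed in the Minecraft chat after `Start LAN World`:

```python
from voyager import Voyager

# Hypothetical port number: replace it with the LAN port shown in the Minecraft
# chat after "Open to LAN" (e.g. "Local game hosted on port 55555").
voyager = Voyager(
    mc_port=55555,
    openai_api_key="YOUR_API_KEY",
)

# start lifelong learning
voyager.learn()
```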
# Resume from a checkpoint during learning

If you stop the learning process and want to resume from a checkpoint later, you can instantiate Voyager by:
```python
from voyager import Voyager

voyager = Voyager(
    azure_login=azure_login,
    openai_api_key=openai_api_key,
    ckpt_dir="YOUR_CKPT_DIR",
    resume=True,
)
```

# Run Voyager for a specific task with a learned skill library
If you want to run Voyager for a specific task with a learned skill library, you should first pass the skill library directory to Voyager:
```python
from voyager import Voyager

# First instantiate Voyager with skill_library_dir.
voyager = Voyager(
    azure_login=azure_login,
    openai_api_key=openai_api_key,
    skill_library_dir="./skill_library/trial1", # Load a learned skill library.
    ckpt_dir="YOUR_CKPT_DIR", # Feel free to use a new dir. Do not use the same dir as skill library because new events will still be recorded to ckpt_dir.
    resume=False, # Do not resume from a skill library because this is not learning.
)
```
Then, you can run task decomposition. Notice: Occasionally, the task decomposition may not be logical. If you notice the printed sub-goals are flawed, you can rerun the decomposition.
```python
# Run task decomposition
task = "YOUR TASK" # e.g. "Craft a diamond pickaxe"
sub_goals = voyager.decompose_task(task=task)
```
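Because the decomposition can occasionally be flawed, one simple (unofficial) pattern is to print the sub-goals and rerun `decompose_task` until the plan looks reasonable:

```python
# Inspect the proposed plan before executing it.
for i, goal in enumerate(sub_goals, start=1):
    print(f"{i}. {goal}")

# If the printed sub-goals look flawed, simply decompose again:
# sub_goals = voyager.decompose_task(task=task)
```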
Finally, you can run the sub-goals with the learned skill library:
```python
voyager.inference(sub_goals=sub_goals)
```

For all valid skill libraries, see [Learned Skill Libraries](skill_library/README.md).
# FAQ
If you have any questions, please check our [FAQ](FAQ.md) first before opening an issue.

# Paper and Citation
If you find our work useful, please consider citing us!
```bibtex
@article{wang2023voyager,
  title   = {Voyager: An Open-Ended Embodied Agent with Large Language Models},
  author  = {Guanzhi Wang and Yuqi Xie and Yunfan Jiang and Ajay Mandlekar and Chaowei Xiao and Yuke Zhu and Linxi Fan and Anima Anandkumar},
  year    = {2023},
  journal = {arXiv preprint arXiv:2305.16291}
}
```

Disclaimer: This project is strictly for research purposes, and not an official product from NVIDIA.