Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/bashmocha/prompting-llms-for-aerial-navigation
Prompts and source code for applying LLMs to UAV-based navigation tasks with various model integrations
- Host: GitHub
- URL: https://github.com/bashmocha/prompting-llms-for-aerial-navigation
- Owner: BashMocha
- Created: 2024-05-12T10:47:41.000Z (8 months ago)
- Default Branch: master
- Last Pushed: 2024-09-19T13:16:06.000Z (4 months ago)
- Last Synced: 2024-12-06T09:38:41.332Z (about 1 month ago)
- Topics: aerial-navigation, aerial-robotics, airsim, large-language-models, llm, prompt-engineering, robotics
- Language: Python
- Homepage:
- Size: 1.72 MB
- Stars: 9
- Watchers: 2
- Forks: 1
- Open Issues: 1
- Metadata Files:
    - Readme: README.md
README
## Prompting Large Language Models for Aerial Navigation
Official implementation of the [UBMK 2024](https://ubmk.org.tr/) paper.

Emirhan Balcı*, [Mehmet Sarıgül](http://mehmetsarigul.com/), [Barış Ata](https://barisata.me/)
https://github.com/user-attachments/assets/98fd3670-67ee-4d0e-8e2f-89cab301a495
## Abstract
Robots are becoming more prevalent and are consequently used in numerous fields thanks to the latest advancements in artificial intelligence. Recent studies have shown promise in human-robot interaction, where non-experts are able to collaborate with robots. Whereas traditional interaction approaches are compact and rigid, natural-language communication offers a coherent approach that makes interaction more versatile. Large language models (LLMs) make it possible for non-expert users to take part in human-robot communication and direct robots to perform complex tasks such as aerial navigation, obstacle avoidance, and pathfinding. In this paper, we performed an experimental study comparing the performance of LLMs based on the source code they generate from prompts to carry out aerial navigation tasks in a simulated environment. The few-shot prompting technique is applied to LLMs such as ChatGPT, Gemini, Mistral, and Claude on Microsoft's AirSim drone simulation. We defined three test cases based on UAV aerial navigation, specified model prompts for each test, and extracted ground-truth trajectories for the test cases. Finally, we ran the models on the simulator with the predefined prompts and compared the predicted trajectories with the ground truth. Our findings indicate that no single model surpasses the others across all test cases, and using LLMs for aerial navigation remains a challenging task in robotic applications.

[Paper](https://ieeexplore.ieee.org/document/10773467) | [Code](https://github.com/BashMocha/Prompting-LLMs-for-Aerial-Navigation/tree/master/src)
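To make the workflow concrete, the sketch below outlines the general pattern described above: a few-shot prompt is sent to an LLM, and the Python snippet it returns is executed against the AirSim client. It is an illustrative outline only, not the repository's scripts; `query_llm` and the prompt text are hypothetical placeholders, and a real implementation would call the provider SDK configured in `./src/config.json`.

```python
import airsim

# Hypothetical placeholder: a real implementation would call the configured
# provider SDK (OpenAI, Gemini, Mistral, or Claude) with the key from config.json.
def query_llm(prompt: str) -> str:
    # Canned response so the sketch runs end to end against a live simulation.
    return "client.takeoffAsync().join()\nclient.moveToPositionAsync(10, 0, -5, 3).join()"

# A minimal few-shot prompt: one worked example, then the command to translate.
FEW_SHOT_PROMPT = """You control a drone through the AirSim Python client `client`.
Example:
  Command: take off and hover
  Code: client.takeoffAsync().join()
Reply with Python code only.
Command: fly to position (10, 0, -5) at 3 m/s
Code:"""

# Connect to the running AirSim simulation and take API control of the drone.
client = airsim.MultirotorClient()
client.confirmConnection()
client.enableApiControl(True)
client.armDisarm(True)

# Ask the LLM for code and execute the generated snippet against the client.
generated_code = query_llm(FEW_SHOT_PROMPT)
exec(generated_code, {"client": client, "airsim": airsim})

# Land and release control once the generated maneuver completes.
client.landAsync().join()
client.armDisarm(False)
client.enableApiControl(False)
```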
## Updates
- 11/12/2024: The paper is published in IEEE Xplore.
- 13/09/2024: The study is accepted by UBMK 2024! 🎉
- 10/08/2024: The paper, with the code, dataset, and prompts, is submitted to the conference.
## Prerequisites
> [!IMPORTANT]
> The project was written and tested on Windows; functionality on other operating systems is not guaranteed,
> and running it on Windows is recommended.

- Prior to initiating the AirSim integration, configure your API keys to enable access to the required models.
- Create a conda environment and install the AirSim client.
- Create the conda environment for a controlled, isolated setup:
```
conda env create -f environment.yml
```
- Activate the environment and install the AirSim client.
```
conda activate llm-env
pip install airsim
```
- Clone the repository.
```
git clone https://github.com/CheesyFrappe/Prompting-LLMs-for-Aerial-Navigation.git
```
- Copy the API keys and paste them into the `API-KEY` field of the `./src/config.json` file (a loading sketch follows at the end of this section).
- Download the simulation environment from [Releases](https://github.com/BashMocha/Prompting-LLMs-for-Aerial-Navigation/releases/), and unzip the package.
- Copy `settings.json` to `C:\Users\\Documents\AirSim\`.
- It is recommended to consult the [documentation](https://microsoft.github.io/AirSim/unreal_custenv/) for additional information on custom simulation environments if needed.
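As a quick sanity check of the setup above, the snippet below loads the API key from `./src/config.json` and confirms that the AirSim Python client can reach a running simulation. This is a minimal sketch under stated assumptions: only the `API-KEY` field comes from this README, any further structure of `config.json` is assumed, and the simulation must already be running (see Usage).

```python
import json
import airsim

# Read the API key from the config file described above.
# Only the "API-KEY" field is documented here; any further structure is assumed.
with open("./src/config.json", "r", encoding="utf-8") as f:
    config = json.load(f)
api_key = config["API-KEY"]

# Connect to the AirSim simulation and verify the link.
client = airsim.MultirotorClient()
client.confirmConnection()

# Optional sanity check: query the drone state before handing control to an LLM.
pos = client.getMultirotorState().kinematics_estimated.position
print("Drone position:", pos.x_val, pos.y_val, pos.z_val)
```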
## Usage

- Execute the AirSim simulation by running `.\run.bat` from the simulation folder.
- Once the simulation is up and running, run the source file for the model being used.
```
python chatgpt_airsim.py --testname first_test --model gpt-3.5-turbo
```
- The recording feature is enabled by default in this project. The trajectories can be found in `C:\Users\\Documents\AirSim\`.
- Execute `evaluation.py` to obtain the results and plot the trajectories; an illustrative comparison sketch follows the command below.
```
python evaluation.py --reference_path ../dataset/first_test.txt --predicted_path
```
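For context on what the evaluation step does, the snippet below sketches one way a predicted trajectory can be compared with a reference one: compute the point-wise RMSE and plot both paths. It is an illustrative sketch, not the repository's `evaluation.py`, and the assumed file format (one whitespace-separated `x y z` triple per line) may differ from the actual trajectory logs.

```python
import numpy as np
import matplotlib.pyplot as plt

def load_trajectory(path):
    """Load a trajectory file assumed to contain one 'x y z' triple per line."""
    return np.loadtxt(path)[:, :3]

reference = load_trajectory("../dataset/first_test.txt")
predicted = load_trajectory("predicted_trajectory.txt")  # hypothetical output file

# Truncate to the shorter trajectory so the point-wise comparison is well defined.
n = min(len(reference), len(predicted))
rmse = np.sqrt(np.mean(np.sum((reference[:n] - predicted[:n]) ** 2, axis=1)))
print(f"RMSE over {n} points: {rmse:.3f} m")

# Plot both trajectories in the horizontal (x, y) plane.
plt.plot(reference[:, 0], reference[:, 1], label="ground truth")
plt.plot(predicted[:, 0], predicted[:, 1], "--", label="predicted")
plt.xlabel("x [m]")
plt.ylabel("y [m]")
plt.legend()
plt.show()
```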
## Citation

If you find the method or code useful, please cite:
```bibtex
@inproceedings{10773467,
author={Balcı, Emirhan and Sarıgül, Mehmet and Ata, Barış},
booktitle={2024 9th International Conference on Computer Science and Engineering (UBMK)},
title={Prompting Large Language Models for Aerial Navigation},
year={2024},
doi={10.1109/UBMK63289.2024.10773467}
}
```
---

Feel free to [contact](mailto:[email protected]) the authors with any questions.