https://github.com/noxs1d/action-model
FastAPI-based application that integrates GPT-4o-mini to generate, validate, and execute Bash commands inside a controlled Docker environment. Uses LangGraph for state management and supports session-based execution.
- Host: GitHub
- URL: https://github.com/noxs1d/action-model
- Owner: noxs1d
- Created: 2025-02-22T09:48:00.000Z (11 months ago)
- Default Branch: master
- Last Pushed: 2025-03-10T07:56:22.000Z (11 months ago)
- Last Synced: 2025-07-22T20:02:32.138Z (6 months ago)
- Topics: async, bash-script, docker, fastapi, langgraph, langgraph-python, llm
- Language: Python
- Homepage:
- Size: 13.7 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
# Action Model
## Overview
Action Model is a FastAPI-based application that interacts with an LLM (GPT-4o-mini) to generate, validate, and execute Bash commands in a controlled environment. The system uses LangGraph for state management and Docker for script execution.
## Features
- 🛠 **FastAPI Backend**: Provides an API for handling user queries.
- ⚡ **Asynchronous Execution**: Uses async/await for non-blocking operations.
- 🤖 **LLM Integration**: Interacts with GPT-4o-mini to generate and validate Bash commands.
- 🔍 **Command Validation**: Ensures that generated commands are correct before execution.
- 🖥 **Bash Script Execution**: Runs validated commands inside a controlled Docker environment.
- 📜 **Session-Based Communication**: Supports thread-based execution keyed by a session ID (a minimal sketch of this flow follows below).
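
A generate → validate → execute flow like this can be expressed as a small LangGraph state machine, with a checkpointer keyed by a `thread_id` providing the session behaviour. The sketch below is illustrative only and assumes the `langgraph` and `langchain-openai` packages; the node names, state fields, and prompt are not taken from `model.py`.

```python
# Illustrative only: node names, state fields, and prompts are assumptions,
# not the actual contents of model.py. Requires OPENAI_API_KEY to be set.
from typing import TypedDict

from langchain_openai import ChatOpenAI
from langgraph.checkpoint.memory import MemorySaver
from langgraph.graph import END, StateGraph


class CommandState(TypedDict):
    query: str      # user request in natural language
    command: str    # Bash command proposed by the LLM
    valid: bool     # result of the validation step
    output: str     # stdout/stderr from execution


llm = ChatOpenAI(model="gpt-4o-mini")


def generate(state: CommandState) -> dict:
    """Ask the LLM for a Bash command that satisfies the query."""
    reply = llm.invoke(f"Write a single Bash command for: {state['query']}")
    return {"command": reply.content.strip()}


def validate(state: CommandState) -> dict:
    """Very rough validation; the real project likely checks more than this."""
    return {"valid": bool(state["command"]) and "rm -rf /" not in state["command"]}


def execute(state: CommandState) -> dict:
    """Placeholder for the Docker-based execution handled by script.py."""
    return {"output": f"(would run) {state['command']}"}


graph = StateGraph(CommandState)
graph.add_node("generate", generate)
graph.add_node("validate", validate)
graph.add_node("execute", execute)
graph.set_entry_point("generate")
graph.add_edge("generate", "validate")
graph.add_conditional_edges("validate", lambda s: "execute" if s["valid"] else END)
graph.add_edge("execute", END)

# MemorySaver plus a thread_id gives the session-based behaviour described above.
app = graph.compile(checkpointer=MemorySaver())
result = app.invoke(
    {"query": "list files"},
    config={"configurable": {"thread_id": "session-1"}},
)
print(result["output"])
```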
## File Structure
```
.
├── fast.py # FastAPI application
├── main.py # CLI-based interaction
├── model.py # LangGraph state management & LLM interactions
├── script.py # Bash script execution logic
├── templates/ # Prompt templates for LLM
├── requirements.txt # Python dependencies
└── README.md # Documentation
```
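
On the execution side, `script.py` presumably wraps Docker via `subprocess`. A minimal sketch of that kind of helper is shown below; the function name, base image, and timeout are assumptions for illustration, not the repository's actual code.

```python
# Illustrative sketch only; names and defaults are assumptions, not script.py itself.
import subprocess


def run_in_container(command: str, image: str = "ubuntu:22.04", timeout: int = 30) -> str:
    """Run a validated Bash command in a throwaway container and return its output."""
    result = subprocess.run(
        ["docker", "run", "--rm", image, "bash", "-lc", command],
        capture_output=True,
        text=True,
        timeout=timeout,
    )
    return result.stdout if result.returncode == 0 else result.stderr


if __name__ == "__main__":
    print(run_in_container("echo hello from the sandbox"))
```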
## Installation & Usage (Docker)
### Prerequisites
- Docker (for script execution)
- OpenAI API Key (required for LLM interactions)
### Steps
1. **Clone the repository**
```bash
git clone https://github.com/noxs1d/action-model.git
cd action-model
```
2. **Set up environment variables**
```bash
export OPENAI_API_KEY=your_api_key_here
```
3. **Build the Docker image**
```bash
docker build -t action-model .
```
4. **Run the container**
```bash
docker run -p 8000:8000 --env OPENAI_API_KEY=$OPENAI_API_KEY --name action-model action-model
```
The API will be available at `http://localhost:8000`, with interactive Swagger documentation at `http://localhost:8000/docs`.
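
Once the container is running, a query can be sent to the API. The endpoint path, payload fields, and session parameter below are assumptions for illustration; check `/docs` for the actual schema.

```python
# Hypothetical client call; the real endpoint and payload are documented at /docs.
import requests

resp = requests.post(
    "http://localhost:8000/query",  # endpoint name is an assumption
    json={"query": "list all .py files", "session_id": "demo-session"},
    timeout=60,
)
resp.raise_for_status()
print(resp.json())
```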
## Future Improvements
- 🛡 **Enhance Security**: Implement command whitelisting to prevent dangerous operations.
- 🚀 **Improve Async Handling**: Replace blocking `subprocess.run()` with `asyncio.create_subprocess_exec()` (see the sketch below this list).
- ✅ **Add Unit Tests**: Use `pytest` to ensure API stability.
- 📊 **Implement Logging**: Improve debugging with structured logs.
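
As a concrete illustration of the async-handling item above, the blocking helper could be swapped for `asyncio.create_subprocess_exec()` roughly as follows (a sketch with an assumed function name and image, not the project's code):

```python
# Sketch of the proposed non-blocking execution; names are illustrative.
import asyncio


async def run_in_container_async(command: str, image: str = "ubuntu:22.04") -> str:
    """Async variant: the event loop stays free while Docker runs the command."""
    proc = await asyncio.create_subprocess_exec(
        "docker", "run", "--rm", image, "bash", "-lc", command,
        stdout=asyncio.subprocess.PIPE,
        stderr=asyncio.subprocess.PIPE,
    )
    stdout, stderr = await proc.communicate()
    return stdout.decode() if proc.returncode == 0 else stderr.decode()


if __name__ == "__main__":
    print(asyncio.run(run_in_container_async("uname -a")))
```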
## Author
[Nurmukhammed](https://github.com/noxs1d)