https://github.com/tomconte/codebox-ai
A secure Python code execution service designed to integrate with LLMs like GPT and Claude, providing a self-hosted alternative to OpenAI's Code Interpreter
- Host: GitHub
- URL: https://github.com/tomconte/codebox-ai
- Owner: tomconte
- License: MIT
- Created: 2024-12-20T11:46:27.000Z (about 1 year ago)
- Default Branch: main
- Last Pushed: 2025-01-30T17:09:02.000Z (11 months ago)
- Last Synced: 2025-02-14T11:34:18.777Z (11 months ago)
- Language: Python
- Size: 5.43 MB
- Stars: 3
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
# CodeBox-AI
A secure Python code execution service that provides a self-hosted alternative to OpenAI's Code Interpreter. Built with FastAPI and IPython kernels, it supports session-based code execution and integrates with LLM function calling.
## Features
- Session-based Python code execution in Docker containers
- IPython kernel for rich output support
- Dynamic package installation with security controls
  - Package allowlist/blocklist system
  - Version control for security vulnerabilities
  - Support for pip and conda installations
- State persistence between executions
- Support for plotting and visualization
- Code security validation (sketched below)
  - AST-based code analysis
  - Protection against dangerous imports and operations
- Support for Jupyter magic commands and shell operations
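The code security validation feature deserves a quick illustration. The following is a minimal sketch of AST-based import filtering, not CodeBox-AI's actual implementation; the blocked-module set and the `validate_code` function name are hypothetical.

```python
import ast

# Hypothetical blocklist for illustration only; the real service's rules will differ.
BLOCKED_MODULES = {"os", "subprocess", "socket"}

def validate_code(source: str) -> list[str]:
    """Return a list of violations found in the submitted code."""
    violations = []
    try:
        tree = ast.parse(source)
    except SyntaxError as exc:
        return [f"Syntax error: {exc}"]

    for node in ast.walk(tree):
        # Catch `import os`-style imports
        if isinstance(node, ast.Import):
            for alias in node.names:
                if alias.name.split(".")[0] in BLOCKED_MODULES:
                    violations.append(f"Blocked import: {alias.name}")
        # Catch `from subprocess import run`-style imports
        elif isinstance(node, ast.ImportFrom):
            if node.module and node.module.split(".")[0] in BLOCKED_MODULES:
                violations.append(f"Blocked import: {node.module}")
    return violations

print(validate_code("import subprocess\nprint('hi')"))
# ['Blocked import: subprocess']
```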
## Prerequisites
- Python 3.9+
- Docker
## Installation
1. Clone the repository:
```bash
git clone https://github.com/tomconte/codebox-ai.git
cd codebox-ai
```
2. Install dependencies:
```bash
python -m venv venv
source venv/bin/activate # On Windows: .\venv\Scripts\activate
pip install -r requirements.txt
```
3. Start the server:
```bash
python -m codeboxai.main
```
The API will be available at `http://localhost:8000`
### Docker "file not found" error
If you encounter a "file not found" `DockerException` when running the server on macOS, you might need to set the `DOCKER_HOST` environment variable. First, find out which context you are using by running:
```bash
docker context ls
```
Then set the `DOCKER_HOST` environment variable to the endpoint shown for your active context, for example:
```bash
export DOCKER_HOST="unix:///Users/tconte/.docker/run/docker.sock"
```
## Usage
### Direct API Usage
1. Create a new session:
```bash
curl -X POST http://localhost:8000/sessions \
-H "Content-Type: application/json" \
-d '{
"dependencies": ["numpy", "pandas"]
}'
```
2. Execute code in the session:
```bash
curl -X POST http://localhost:8000/execute \
-H "Content-Type: application/json" \
-d '{
"code": "x = 42\nprint(f\"Value of x: {x}\")",
"session_id": "YOUR_SESSION_ID"
}'
```
3. Execute more code in the same session:
```bash
curl -X POST http://localhost:8000/execute \
-H "Content-Type: application/json" \
-d '{
"code": "print(f\"x is still: {x}\")",
"session_id": "YOUR_SESSION_ID"
}'
```
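The same flow works from any HTTP client. Below is a minimal Python sketch using `requests`; the `session_id` field name in the JSON response is an assumption inferred from the endpoint list further down, so check the actual response shape.

```python
import requests

BASE_URL = "http://localhost:8000"

# Create a session with the packages we need
resp = requests.post(f"{BASE_URL}/sessions", json={"dependencies": ["numpy", "pandas"]})
resp.raise_for_status()
session_id = resp.json()["session_id"]  # field name assumed

# Execute code in that session; state persists between calls
resp = requests.post(
    f"{BASE_URL}/execute",
    json={"code": "x = 42\nprint(f'Value of x: {x}')", "session_id": session_id},
)
resp.raise_for_status()
print(resp.json())
```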
### OpenAI GPT Integration Example
An example script is provided to demonstrate integration with OpenAI's GPT models.
1. Create a `.env` file in the project root:
```
AZURE_OPENAI_ENDPOINT=https://xxx.cognitiveservices.azure.com/
AZURE_OPENAI_API_KEY=foo
AZURE_OPENAI_DEPLOYMENT=gpt-4o
OPENAI_API_VERSION=2024-05-01-preview
```
2. Install additional requirements:
```bash
pip install -r examples/requirements.txt
```
3. Run the example:
```bash
python examples/example_openai.py
```
This will start an interactive session where you can chat with GPT-4 and have it execute Python code. The script maintains state between executions, so variables and imports persist across interactions.
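In outline, such an integration registers a code-execution tool with the model and forwards any tool calls to the CodeBox-AI API. The sketch below is a condensed approximation, not the contents of `examples/example_openai.py`; the tool name `execute_python`, the prompt, and the response handling are illustrative assumptions.

```python
import json
import os

import requests
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version=os.environ["OPENAI_API_VERSION"],
)

# Hypothetical tool definition the model can call to run Python code.
tools = [{
    "type": "function",
    "function": {
        "name": "execute_python",
        "description": "Execute Python code in a persistent sandbox session.",
        "parameters": {
            "type": "object",
            "properties": {"code": {"type": "string"}},
            "required": ["code"],
        },
    },
}]

messages = [{"role": "user", "content": "Compute the mean of [1, 2, 3, 4] in Python."}]
response = client.chat.completions.create(
    model=os.environ["AZURE_OPENAI_DEPLOYMENT"],
    messages=messages,
    tools=tools,
)

message = response.choices[0].message
if message.tool_calls:
    for tool_call in message.tool_calls:
        if tool_call.function.name == "execute_python":
            code = json.loads(tool_call.function.arguments)["code"]
            # Forward the code to CodeBox-AI (session creation omitted for brevity).
            result = requests.post(
                "http://localhost:8000/execute",
                json={"code": code, "session_id": "YOUR_SESSION_ID"},
            )
            print(result.json())
else:
    print(message.content)
```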

## API Endpoints
- `POST /sessions` - Create a new session
- `POST /execute` - Execute code in a session
- `GET /execute/{request_id}/status` - Get execution status
- `GET /execute/{request_id}/results` - Get execution results
- `DELETE /sessions/{session_id}` - Cleanup a session
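The separate status and results endpoints suggest executions are tracked by request ID, so a client can poll until a run finishes. Here is a minimal polling sketch, assuming the status response carries a `status` field with values like `completed` (field names and values are assumptions):

```python
import time

import requests

BASE_URL = "http://localhost:8000"

def wait_for_result(request_id: str, timeout: float = 30.0) -> dict:
    """Poll the status endpoint until execution finishes, then return the results."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        status = requests.get(f"{BASE_URL}/execute/{request_id}/status").json()
        # "status" and its values are assumed field names; adjust to the real schema.
        if status.get("status") in ("completed", "failed", "error"):
            return requests.get(f"{BASE_URL}/execute/{request_id}/results").json()
        time.sleep(0.5)
    raise TimeoutError(f"Execution {request_id} did not finish in {timeout}s")
```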
## Security Notes
- Code execution is containerized using Docker
- Each session runs in an isolated environment
- Basic resource limits are implemented
- Network access is available but can be restricted (see the sketch below)
- Input code validation is implemented for basic security
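As one illustration of what restricting the sandbox could look like, the sketch below starts a locked-down container with the Docker SDK for Python. It is not necessarily how CodeBox-AI configures its containers; the image name and limits are arbitrary examples.

```python
import docker

client = docker.from_env()

# Example hardening options for a sandbox container (values are illustrative).
output = client.containers.run(
    "python:3.11-slim",
    ["python", "-c", "print('hello from the sandbox')"],
    network_disabled=True,    # cut off network access entirely
    mem_limit="512m",         # cap memory usage
    nano_cpus=1_000_000_000,  # roughly one CPU
    pids_limit=128,           # limit process count
    remove=True,              # clean up the container afterwards
)
print(output.decode())
```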
## License
MIT License - See LICENSE file for details.
## A Note on Authorship
This code was pair-programmed with Claude 3.5 Sonnet (yes, an AI helping to build tools for other AIs - very meta). While I handled the product decisions and architecture reviews, Claude did most of the heavy lifting in terms of code generation and documentation. Even this README was written by Claude, which makes this acknowledgment a bit like an AI writing about an AI writing about AI tools... we need to go deeper 🤖✨
Humans were (mostly) present during the development process. No AIs were harmed in the making of this project, though a few might have gotten slightly dizzy from the recursion.
---
A prototype implementation, not intended for production use without additional security measures.