
# DSPy Proxy Server

A proxy server for [DSPy](https://github.com/stanfordnlp/dspy) that lets you register signatures, execute modules, and run optimization procedures via a REST API. This enables using DSPy from languages other than Python. The idea was seeded by this [tweet](https://x.com/skylar_b_payne/status/1990808733140779488)!

[![Deploy to Render](https://render.com/images/deploy-to-render-button.svg)](https://render.com/deploy?repo=https://github.com/aryaminus/dspy-proxy)

## Features

- **Register Signatures**: Define input/output signatures dynamically.
- **Execute Modules**: Run `Predict`, `ChainOfThought`, or `ProgramOfThought` modules.
- **Optimize**: Compile modules using `BootstrapFewShot`, `MIPROv2`, or `COPRO`.
- **Evaluate**: Run evaluation on your modules using `dspy.Evaluate`.
- **Configure LM**: Set up the Language Model (any `litellm`-supported provider, e.g. OpenAI, Anthropic).

## Installation

1. Clone the repository.
2. Create a virtual environment and install dependencies:
```bash
python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt
```

## Usage

Start the server:
```bash
python main.py
```

The server runs on `http://0.0.0.0:8000`.
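Once the server is up, you can call it from any language that speaks HTTP. As a minimal sketch using only the Python standard library (endpoint path and field names taken from the API reference below; adjust `BASE_URL` to your deployment):

```python
import json
import urllib.request

BASE_URL = "http://localhost:8000"  # the server binds 0.0.0.0:8000 by default

def build_request(path: str, body: dict) -> urllib.request.Request:
    """Build a JSON POST request for a proxy endpoint."""
    return urllib.request.Request(
        BASE_URL + path,
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def post_json(path: str, body: dict) -> dict:
    """Send the request and decode the JSON response."""
    with urllib.request.urlopen(build_request(path, body)) as resp:
        return json.loads(resp.read().decode("utf-8"))

if __name__ == "__main__":
    # Point the proxy at a model; with api_key omitted, the server falls
    # back to OPENAI_API_KEY from the environment.
    print(post_json("/configure", {"provider": "openai", "model": "gpt-4o-mini"}))
```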

## Deployment

### Option 1: One-Click Deploy

[![Deploy to Render](https://render.com/images/deploy-to-render-button.svg)](https://render.com/deploy?repo=https://github.com/aryaminus/dspy-proxy)

### Option 2: Manual Deployment

This project is configured for deployment on [Render](https://render.com).

1. Fork this repository.
2. Create a new **Web Service** on Render.
3. Connect your repository.
4. Render will automatically detect the `render.yaml` configuration.
5. Add your `OPENAI_API_KEY` in the Environment Variables settings on Render.

## API Endpoints

### 1. Configure LM
`POST /configure`
```json
{
  "provider": "openai",
  "model": "gpt-4o-mini",
  "api_key": "sk-..."
}
```
Supported providers: `openai`, `anthropic`, `google`, `azure`, etc. (any provider supported by `litellm`).
If `api_key` is omitted, the server will look for environment variables like `OPENAI_API_KEY`, `ANTHROPIC_API_KEY`, etc.

### 2. Register Signature
`POST /register`
```json
{
  "name": "qa",
  "signature": "question -> answer",
  "instructions": "Answer the question concisely."
}
```

### 3. Predict
`POST /predict`
```json
{
  "signature_name": "qa",
  "inputs": {
    "question": "What is the capital of France?"
  },
  "module_type": "ChainOfThought"
}
```
Supported `module_type`: `Predict`, `ChainOfThought`, `ProgramOfThought`.

### 4. Optimize
`POST /optimize`
```json
{
  "signature_name": "qa",
  "train_data": [
    {"question": "What is 1+1?", "answer": "2"},
    {"question": "What is 2+2?", "answer": "4"}
  ],
  "metric": "exact_match",
  "optimizer": "BootstrapFewShot",
  "max_bootstraps": 4
}
```
Supported `optimizer`: `BootstrapFewShot`, `MIPROv2`, `COPRO`.
Returns a `module_id`.

### 5. Predict with Optimized Module
`POST /predict`
```json
{
  "signature_name": "qa",
  "inputs": {
    "question": "What is 3+3?"
  },
  "compiled_module_id": "qa_opt_0"
}
```

### 6. Evaluate
`POST /evaluate`
```json
{
  "signature_name": "qa",
  "test_data": [
    {"question": "What is 5+5?", "answer": "10"}
  ],
  "metric": "exact_match",
  "compiled_module_id": "qa_opt_0"
}
```
Returns a `score`.
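
Putting the endpoints together, a typical register → optimize → predict → evaluate round trip looks roughly like this (a sketch using Python's standard library against a locally running server; request and response field names such as `module_id` and `score` follow the examples above):

```python
import json
import urllib.request

BASE_URL = "http://localhost:8000"  # default local address

def post_json(path: str, body: dict) -> dict:
    """POST a JSON body to the proxy and return the decoded JSON response."""
    req = urllib.request.Request(
        BASE_URL + path,
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))

TRAIN = [
    {"question": "What is 1+1?", "answer": "2"},
    {"question": "What is 2+2?", "answer": "4"},
]
TEST = [{"question": "What is 5+5?", "answer": "10"}]

def optimize_body(signature_name: str, train_data: list) -> dict:
    """Request body for POST /optimize (fields per the API reference above)."""
    return {
        "signature_name": signature_name,
        "train_data": train_data,
        "metric": "exact_match",
        "optimizer": "BootstrapFewShot",
        "max_bootstraps": 4,
    }

if __name__ == "__main__":
    post_json("/register", {"name": "qa", "signature": "question -> answer"})
    module_id = post_json("/optimize", optimize_body("qa", TRAIN))["module_id"]
    answer = post_json("/predict", {
        "signature_name": "qa",
        "inputs": {"question": "What is 3+3?"},
        "compiled_module_id": module_id,
    })
    score = post_json("/evaluate", {
        "signature_name": "qa",
        "test_data": TEST,
        "metric": "exact_match",
        "compiled_module_id": module_id,
    })["score"]
    print(answer, score)
```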