πŸ€– Open-source LLM server (OpenAI, Ollama, Groq, Anthropic) with support for HTTP, Streaming, Agents, RAG
https://github.com/enso-labs/llm-server

# 🚨 DEPRECATION NOTICE

**⚠️ This repository is no longer maintained.**
As of **June 10, 2025**, this project has been officially deprecated and archived. It will no longer receive updates, security patches, or support from the maintainers.

### πŸ‘‰ Recommended Action:
We recommend transitioning to [Ensō Orchestra](https://github.com/enso-labs/orchestra), which serves as the actively maintained and improved successor to this project.

---


# πŸ€– Prompt Engineers AI - LLM Server

Full LLM REST API with prompts, LLMs, Vector Databases, and Agents

## πŸ“– Table of Contents

- [Deploy](https://github.com/promptengineers-ai/llm-server/blob/development/docs/deploy)
- [Tools](https://github.com/promptengineers-ai/llm-server/blob/development/docs/tools)

## πŸ› οΈ Setup Services
```bash
### Setup Docker Services
docker-compose up --build
```
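Once the Docker services are up, you can check that the backing services are reachable before starting the server. The sketch below uses only the Python standard library; the ports (Redis on 6379, MinIO on 9000) are assumptions taken from the environment-variable examples later in this README:

```python
import socket

def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # Ports assumed from the environment-variable examples in this README.
    for name, port in [("redis", 6379), ("minio", 9000)]:
        status = "up" if port_open("localhost", port, timeout=0.5) else "down"
        print(f"{name}: {status}")
```

If a service shows as `down`, check `docker-compose ps` and the compose file's port mappings before moving on to the server setup.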

## πŸ› οΈ Setup Server

Before running the server, copy the example environment file (`cp .example.env .env`) and fill in the values; see [Environment Variables](https://github.com/promptengineers-ai/llm-server?tab=readme-ov-file#environment-variables).

```bash
### Change into Backend directory
cd backend

### Setup Virtual Env
python3 -m venv .venv

### Activate Virtual Env
source .venv/bin/activate

### Install Runtime & Dev Dependencies
pip install -r requirements.txt -r requirements-dev.txt -c constraints.txt

### Install Runtime Dependencies only
pip install -r requirements.txt -c constraints.txt

### Migrate Database Schema
alembic upgrade head

### Seed Database Users
python3 -m src.seeds.users 3

### Run Application on local machine
bash scripts/dev.sh
```
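With the server running locally, requests can be issued over plain HTTP. The exact routes live in the FastAPI code under `backend/src`; the `/api/v1/chat` path, the port `8000`, and the payload keys below are illustrative assumptions, not the server's documented API. A minimal request builder using only the standard library:

```python
import json
import urllib.request

def build_chat_request(base_url: str, messages: list, model: str = "gpt-4o"):
    """Build a POST request for a hypothetical chat endpoint.

    The `/api/v1/chat` path and the payload shape are assumptions for
    illustration; check the FastAPI routes in `backend/src` for the
    actual contract.
    """
    body = json.dumps({"model": model, "messages": messages}).encode("utf-8")
    return urllib.request.Request(
        f"{base_url}/api/v1/chat",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request(
    "http://localhost:8000",
    [{"role": "user", "content": "Hello"}],
)
print(req.full_url)  # http://localhost:8000/api/v1/chat
```

Sending the request is then a matter of `urllib.request.urlopen(req)` (or any HTTP client) once the server is up.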

## πŸ› οΈ Setup Client
```bash
### Change into Frontend directory
cd frontend

### Install node_modules
npm install

### Start Development Server
npm run dev
```

### Environment Variables


| Variable Name | Example | Description |
|---|---|---|
| `APP_ENV` | `development` | Environment where the application is running |
| `APP_VERSION` | `0.0.1` | Version of the application |
| `APP_SECRET` | `this-is-top-secret` | Secret key for the application |
| `APP_WORKERS` | `1` | Number of application workers |
| `APP_ADMIN_EMAIL` | `admin@example.com` | Admin email for the application |
| `APP_ADMIN_PASS` | `test1234` | Admin password for the application |
| `TEST_USER_ID` | `0000000000000000000000000` | Test user ID |
| `DATABASE_URL` | `mysql+aiomysql://admin:password@localhost:3306/llm_server` | URL for the database |
| `PINECONE_API_KEY` | | API key for Pinecone services |
| `PINECONE_ENV` | `us-east1-gcp` | Pinecone environment configuration |
| `PINECONE_INDEX` | `default` | Default Pinecone index used |
| `REDIS_URL` | `redis://localhost:6379` | URL for the Redis service |
| `OPENAI_API_KEY` | `sk-abc123...` | Default OpenAI LLM key |
| `GROQ_API_KEY` | | API key for accessing Groq services |
| `ANTHROPIC_API_KEY` | | API key for accessing Anthropic services |
| `OLLAMA_BASE_URL` | `http://localhost:11434` | Base URL for the Ollama service |
| `SEARX_SEARCH_HOST_URL` | `http://localhost:8080` | URL for the Searx search service |
| `MINIO_HOST` | `localhost:9000` | URL of the object storage |
| `BUCKET` | `my-documents` | Name of the MinIO or S3 bucket |
| `S3_REGION` | `us-east-1` | Region where the S3 bucket exists |
| `ACCESS_KEY_ID` | `AKIAIOSFODNN7EXAMPLE` | IAM user access key ID (optional) |
| `ACCESS_SECRET_KEY` | `wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY` | Secret IAM key (optional) |
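In a FastAPI app, variables like these are typically read once at startup. Below is an illustrative sketch of loading a few of them with defaults, using only the standard library; the variable names and example defaults come from the table above, but the `Settings` class itself is an assumption, not the project's actual config module:

```python
import os
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Settings:
    """Illustrative settings loader; variable names match the table above."""
    app_env: str = field(default_factory=lambda: os.getenv("APP_ENV", "development"))
    database_url: str = field(default_factory=lambda: os.getenv(
        "DATABASE_URL", "mysql+aiomysql://admin:password@localhost:3306/llm_server"))
    redis_url: str = field(default_factory=lambda: os.getenv(
        "REDIS_URL", "redis://localhost:6379"))
    openai_api_key: Optional[str] = field(
        default_factory=lambda: os.getenv("OPENAI_API_KEY"))

settings = Settings()
print(settings.app_env)
```

Values set in `.env` (loaded into the process environment) override these defaults, since `os.getenv` falls back to the second argument only when the variable is unset.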

## πŸš€ Roadmap

Here are the upcoming features I (ryaneggleston@promptengineers.ai) am excited to bring to Prompt Engineers AI - LLM Server (more to come):

- [x] πŸ€– **Foundation Model Providers Supported (OpenAI, Anthropic, Ollama, Groq, Google... coming soon.)**
- [x] πŸ“Έ **Multi-Modal Models Generation**
- [x] πŸ“‘ **Retrieval Augmented Generation (RAG)**
- [x] πŸ›  **UI-Based Tool Configuration**
- [x] πŸ–₯ [**Code Interpreter**](https://github.com/promptengineers-ai/llm-server/blob/52b82eee1744d2b9543f788b835082c72fb8869c/backend/src/tools/__init__.py#L89)
- ⚠️ Use with Caution. Recommend [E2B Data Analysis](https://python.langchain.com/v0.2/docs/integrations/tools/e2b_data_analysis/)
- [ ] πŸŒ‘ **Dark Mode**
- [ ] 🎨 **Configure Custom Theme and Logos**
- [ ] πŸ€– **Assistant Creation Capability**

If you'd like to see a feature added to the roadmap, create an issue and let's start a discussion.

## 🀝 How to Contribute

We welcome contributions from the community, from beginners to seasoned developers. Here's how you can contribute:

1. **Fork the repository**: Click on the 'Fork' button at the top right corner of the repository page on GitHub.

2. **Clone the forked repository** to your local machine: `git clone <your-fork-url>`.

3. **Navigate to the project folder**: `cd llm-server`.

4. **Create a new branch** for your changes: `git checkout -b <feature-branch>`.

5. **Make your changes** in the new branch.

6. **Commit your changes**: `git commit -am 'Add some feature'`.

7. **Push to the branch**: `git push origin <feature-branch>`.

8. **Open a Pull Request**: Go back to your forked repository on GitHub and click on 'Compare & pull request' to create a new pull request.

Please ensure that your code passes all the tests and if possible, add tests for new features. Always write a clear and concise commit message and pull request description.

## πŸ’‘ Issues

Feel free to submit issues and enhancement requests. We're always looking for feedback and suggestions.

## πŸ€“ Maintainers

- `Ryan Eggleston` - `ryaneggleston@promptengineers.ai`

## πŸ“œ License

This project is open-source, under the [MIT License](LICENSE). Feel free to use, modify, and distribute the code as you please.

Happy Prompting! πŸŽ‰πŸŽ‰