https://github.com/enso-labs/llm-server
🤖 Open-source LLM server (OpenAI, Ollama, Groq, Anthropic) with support for HTTP, Streaming, Agents, RAG
- Host: GitHub
- URL: https://github.com/enso-labs/llm-server
- Owner: enso-labs
- Created: 2023-11-04T15:21:14.000Z (almost 2 years ago)
- Default Branch: development
- Last Pushed: 2025-06-10T22:08:19.000Z (4 months ago)
- Last Synced: 2025-06-10T22:32:27.473Z (4 months ago)
- Topics: ai, anthropic, bedrock, fastapi, groq, langchain, large-language-models, minio, ollama, openai, python, redis, server, vercel
- Language: TypeScript
- Homepage: https://promptengineersai.netlify.app
- Size: 27.9 MB
- Stars: 32
- Watchers: 2
- Forks: 11
- Open Issues: 23
Metadata Files:
- Readme: README.md
- Changelog: Changelog.md
README
# 🚨 DEPRECATION NOTICE
**⚠️ This repository is no longer maintained.**
As of **June 10, 2025**, this project has been officially deprecated and archived. It will no longer receive updates, security patches, or support from the maintainers.

### Recommended Action
We recommend transitioning to [Ensō Orchestra](https://github.com/enso-labs/orchestra), which serves as the actively maintained and improved successor to this project.

---
# 🤖 Prompt Engineers AI - LLM Server
Full LLM REST API with prompts, LLMs, Vector Databases, and Agents
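To give a sense of the HTTP and streaming interface mentioned above, here is a minimal request sketch. The port, route, and payload shape are assumptions chosen for illustration; consult the interactive API docs of a running instance for the actual contract.

```bash
# Hypothetical chat completion request against a locally running server.
# Port 8000, the /api/v1/chat route, and the JSON body are illustrative assumptions.
curl -N -X POST http://localhost:8000/api/v1/chat \
  -H "Content-Type: application/json" \
  -d '{
        "model": "gpt-3.5-turbo",
        "messages": [{"role": "user", "content": "Hello!"}],
        "stream": true
      }'
```

The `-N` flag disables curl's output buffering so streamed tokens appear as they arrive.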
## Table of Contents
- [Deploy](https://github.com/promptengineers-ai/llm-server/blob/development/docs/deploy)
- [Tools](https://github.com/promptengineers-ai/llm-server/blob/development/docs/tools)

## Setup Services
```bash
### Setup Docker Services
docker-compose up --build
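# The compose stack is expected (per the project topics) to provide supporting
# services such as Redis and MinIO; add -d to run it in the background:
# docker-compose up --build -d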
```

## Setup Server
Before running the server, copy the example environment file with `cp .example.env .env` and fill in the values described in [Environment Variables](https://github.com/promptengineers-ai/llm-server?tab=readme-ov-file#environment-variables).
```bash
### Change into Backend directory
cd backend

### Setup Virtual Env
python3 -m venv .venv

### Activate Virtual Env
source .venv/bin/activate

### Install Runtime & Dev Dependencies
pip install -r requirements.txt -r requirements-dev.txt -c constraints.txt

### Install Runtime Dependencies
pip install -r requirements.txt -c constraints.txt

### Migrate Database Schema
alembic upgrade head

### Seed Database Users
python3 -m src.seeds.users 3

### Run Application on local machine
bash scripts/dev.sh
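# Once the dev script is running, FastAPI serves interactive API docs at /docs.
# The port below is an assumption; check scripts/dev.sh for the actual bind address.
# curl -I http://localhost:8000/docs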
```

## Setup Client
```bash
### Change into Frontend directory
cd frontend

### Install node_modules
npm install

### Start Development Server
npm run dev
```

### Environment Variables

| Variable Name | Example | Description |
| --- | --- | --- |
| `APP_ENV` | 'development' | Environment where the application is running |
| `APP_VERSION` | 0.0.1 | Version of the application |
| `APP_SECRET` | this-is-top-secret | Secret key for the application |
| `APP_WORKERS` | 1 | Number of application workers |
| `APP_ADMIN_EMAIL` | admin@example.com | Admin email for the application |
| `APP_ADMIN_PASS` | test1234 | Admin password for the application |
| `TEST_USER_ID` | 0000000000000000000000000 | Test user ID |
| `DATABASE_URL` | mysql+aiomysql://admin:password@localhost:3306/llm_server | URL for the database |
| `PINECONE_API_KEY` | | API key for Pinecone services |
| `PINECONE_ENV` | us-east1-gcp | Pinecone environment configuration |
| `PINECONE_INDEX` | default | Default Pinecone index used |
| `REDIS_URL` | redis://localhost:6379 | URL for the Redis service |
| `OPENAI_API_KEY` | sk-abc123... | Default LLM OpenAI key |
| `GROQ_API_KEY` | | API key for accessing Groq services |
| `ANTHROPIC_API_KEY` | | API key for accessing Anthropic services |
| `OLLAMA_BASE_URL` | http://localhost:11434 | Base URL for the Ollama service |
| `SEARX_SEARCH_HOST_URL` | http://localhost:8080 | URL for the Searx search service |
| `MINIO_HOST` | localhost:9000 | URL to the object storage |
| `BUCKET` | my-documents | Name of MinIO or S3 bucket |
| `S3_REGION` | us-east-1 | Region where the S3 bucket exists |
| `ACCESS_KEY_ID` | AKIAIOSFODNN7EXAMPLE | IAM user access key ID (optional) |
| `ACCESS_SECRET_KEY` | wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY | IAM secret access key (optional) |
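As a quick start, the table above can be condensed into a minimal `.env` sketch. The values are the illustrative examples from the table, not working credentials; drop or adjust entries for providers you do not use.

```bash
# .env: minimal local configuration assembled from the table above.
# All values are illustrative placeholders; replace them with your own.
APP_ENV=development
APP_SECRET=this-is-top-secret
DATABASE_URL=mysql+aiomysql://admin:password@localhost:3306/llm_server
REDIS_URL=redis://localhost:6379
OPENAI_API_KEY=sk-abc123...
OLLAMA_BASE_URL=http://localhost:11434
MINIO_HOST=localhost:9000
BUCKET=my-documents
```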
## Roadmap
Here are the upcoming features I (ryaneggleston@promptengineers.ai) am excited to bring to Prompt Engineers AI - LLM Server (more to come):
- [x] **Foundation Model Providers Supported (OpenAI, Anthropic, Ollama, Groq, Google... coming soon)**
- [x] **Multi-Modal Model Generation**
- [x] **Retrieval Augmented Generation (RAG)**
- [x] **UI-Based Tool Configuration**
- [x] [**Code Interpreter**](https://github.com/promptengineers-ai/llm-server/blob/52b82eee1744d2b9543f788b835082c72fb8869c/backend/src/tools/__init__.py#L89)
  - ⚠️ Use with caution. We recommend [E2B Data Analysis](https://python.langchain.com/v0.2/docs/integrations/tools/e2b_data_analysis/) instead.
- [ ] **Dark Mode**
- [ ] **Configure Custom Theme and Logos**
- [ ] **Assistant Creation Capability**

Create an issue and let's start a discussion if you'd like to see a feature added to the roadmap.
## How to Contribute
We welcome contributions from the community, from beginners to seasoned developers. Here's how you can contribute:
1. **Fork the repository**: Click on the 'Fork' button at the top right corner of the repository page on GitHub.
2. **Clone the forked repository** to your local machine: `git clone <your-fork-url>`.
3. **Navigate to the project folder**: `cd llm-server`.
4. **Create a new branch** for your changes: `git checkout -b <feature-branch>`.
5. **Make your changes** in the new branch.
6. **Commit your changes**: `git commit -am 'Add some feature'`.
7. **Push to the branch**: `git push origin <feature-branch>`.
8. **Open a Pull Request**: Go back to your forked repository on GitHub and click on 'Compare & pull request' to create a new pull request.
Please ensure that your code passes all the tests, and add tests for new features where possible. Always write a clear and concise commit message and pull request description.
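For convenience, here is the same workflow as a single command sequence; `<your-fork-url>` and `<feature-branch>` are placeholders to substitute.

```bash
# Consolidated contribution workflow; replace the placeholders before running.
git clone <your-fork-url>
cd llm-server
git checkout -b <feature-branch>
# ...make and test your changes...
git commit -am 'Add some feature'
git push origin <feature-branch>
# Then open a pull request from your fork on GitHub.
```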
## Issues
Feel free to submit issues and enhancement requests. We're always looking for feedback and suggestions.
## Maintainers
- `Ryan Eggleston` - `ryaneggleston@promptengineers.ai`
## License
This project is open-source, under the [MIT License](LICENSE). Feel free to use, modify, and distribute the code as you please.
Happy Prompting!