https://github.com/traceloop/hub
High-scale LLM gateway, written in Rust. OpenTelemetry-based observability included
- Host: GitHub
- URL: https://github.com/traceloop/hub
- Owner: traceloop
- License: apache-2.0
- Created: 2024-10-24T13:43:43.000Z (over 1 year ago)
- Default Branch: main
- Last Pushed: 2026-01-11T13:19:04.000Z (about 1 month ago)
- Last Synced: 2026-01-11T16:46:03.927Z (about 1 month ago)
- Topics: artificial-intelligence, datascience, generative-ai, llm, llmops, ml, model-monitoring, observability, open-source, opentelemetry, rust
- Language: Rust
- Homepage: https://www.traceloop.com/docs/hub
- Size: 731 KB
- Stars: 150
- Watchers: 1
- Forks: 28
- Open Issues: 8
Metadata Files:
- Readme: README.md
- Changelog: CHANGELOG.md
- License: LICENSE
Awesome Lists containing this project
- awesome-repositories - traceloop/hub - High-scale LLM gateway, written in Rust. OpenTelemetry-based observability included (Rust)
README
Open-source, high-performance LLM gateway written in Rust. Connect to any LLM provider with a single API. Observability Included.
Get started »
Slack |
Docs
Hub is a next-generation smart proxy for LLM applications. It centralizes control and tracing of all your LLM calls.
Built in Rust, it's fast and efficient, and it's completely open source and free to use.
Built and maintained by Traceloop under the Apache 2.0 license.
## 🚀 Getting Started
Copy `config-example.yaml` to `config.yaml` and set the correct values, following the [configuration](https://www.traceloop.com/docs/hub/configuration) instructions.
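For orientation, a minimal `config.yaml` might look like the sketch below. The exact field names are assumptions based on typical gateway configurations — treat the linked configuration docs as the authoritative schema:

```
# Illustrative sketch only; see https://www.traceloop.com/docs/hub/configuration
# for the real schema and supported keys.
providers:
  - key: openai
    type: openai
    api_key: "<your-openai-api-key>"

models:
  - key: gpt-4o-openai
    type: gpt-4o
    provider: openai

pipelines:
  - name: default
    type: chat
```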
You can then run the hub using the Docker image:
```
docker run --rm -p 3000:3000 \
  -v $(pwd)/config.yaml:/etc/hub/config.yaml:ro \
  -e CONFIG_FILE_PATH='/etc/hub/config.yaml' \
  -t traceloop/hub
```
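If you prefer Docker Compose, the same container can be described as a service. This is a sketch derived directly from the `docker run` flags above (image name, port, mount, and environment variable all match); adjust paths to your setup:

```
services:
  hub:
    image: traceloop/hub
    ports:
      - "3000:3000"
    volumes:
      # Mount your local config read-only into the container
      - ./config.yaml:/etc/hub/config.yaml:ro
    environment:
      - CONFIG_FILE_PATH=/etc/hub/config.yaml
```

Run it with `docker compose up` from the directory containing `config.yaml`.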
You can also run it locally. Make sure you have Rust v1.82 or above installed, then run:
```
cargo run
```
Connect to the hub by using the OpenAI SDK in any language and setting the base URL to:
```
http://localhost:3000/api/v1
```
For example, in Python:
```
import os

from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:3000/api/v1",
    api_key=os.getenv("OPENAI_API_KEY"),
    # Optionally route requests through a specific pipeline:
    # default_headers={"x-traceloop-pipeline": "azure-only"},
)
completion = client.chat.completions.create(
    model="claude-3-5-sonnet-20241022",
    messages=[{"role": "user", "content": "Tell me a joke about opentelemetry"}],
    max_tokens=1000,
)
print(completion.choices[0].message.content)
```
## 🌱 Contributing
Whether big or small, we love contributions ❤️ Check out our guide to see how to [get started](https://traceloop.com/docs/hub/contributing/overview).
Not sure where to get started? You can:
- [Book a free pairing session with one of our teammates](mailto:nir@traceloop.com?subject=Pairing%20session&body=I'd%20like%20to%20do%20a%20pairing%20session!)!
- Join our Slack, and ask us any questions there.
## 💬 Community & Support
- [Slack](https://traceloop.com/slack) (For live discussion with the community and the Traceloop team)
- [GitHub Discussions](https://github.com/traceloop/hub/discussions) (For help with building and deeper conversations about features)
- [GitHub Issues](https://github.com/traceloop/hub/issues) (For any bugs and errors you encounter using Hub)
- [Twitter](https://twitter.com/traceloopdev) (Get news fast)