Open-source, high-performance LLM gateway written in Rust. Connect to any LLM provider with a single API. Observability Included.


Hub is a next-generation smart proxy for LLM applications. It centralizes control and tracing of all your LLM calls in one place.
It's built in Rust, so it's fast and efficient, and it's completely open source and free to use.

Built and maintained by Traceloop under the Apache 2.0 license.

## 🚀 Getting Started

Create a `config.yaml` file by copying `config-example.yaml` and setting the correct values, following the [configuration](https://www.traceloop.com/docs/hub/configuration) instructions.
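
For orientation, a minimal `config.yaml` might look roughly like the sketch below. The provider key, model name, and pipeline plugin shown here are illustrative assumptions rather than the authoritative schema; follow the configuration docs linked above for the real options:

```
# Illustrative sketch only -- see the configuration docs for the authoritative schema.
providers:
  - key: openai                # a provider entry: a name plus credentials (assumed layout)
    type: openai
    api_key: "<your-openai-api-key>"

models:
  - key: gpt-4o                # a model entry mapping a model name to a provider (assumed layout)
    type: gpt-4o
    provider: openai

pipelines:
  - name: default              # a pipeline routing chat requests to the model above (assumed layout)
    type: chat
    plugins:
      - model-router:
          models:
            - gpt-4o
```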

You can then run the hub using the Docker image:

```
docker run --rm -p 3000:3000 -v $(pwd)/config.yaml:/etc/hub/config.yaml:ro -e CONFIG_FILE_PATH='/etc/hub/config.yaml' -t traceloop/hub
```

You can also run it locally. Make sure you have Rust v1.82 or above installed, then run:

```
cargo run
```

Connect to the hub by using the OpenAI SDK in any language and setting the base URL to:

```
http://localhost:3000/api/v1
```

For example, in Python:

```
import os

from openai import OpenAI

# Point the standard OpenAI client at the hub instead of api.openai.com.
client = OpenAI(
    base_url="http://localhost:3000/api/v1",
    api_key=os.getenv("OPENAI_API_KEY"),
    # Optionally pin requests to a named pipeline from config.yaml:
    # default_headers={"x-traceloop-pipeline": "azure-only"},
)
completion = client.chat.completions.create(
    model="claude-3-5-sonnet-20241022",
    messages=[{"role": "user", "content": "Tell me a joke about opentelemetry"}],
    max_tokens=1000,
)
```

## 🌱 Contributing

Whether big or small, we love contributions ❤️ Check out our guide to see how to [get started](https://traceloop.com/docs/hub/contributing/overview).

Not sure where to get started? You can:

- [Book a free pairing session with one of our teammates](mailto:nir@traceloop.com?subject=Pairing%20session&body=I'd%20like%20to%20do%20a%20pairing%20session!)!
- Join our [Slack](https://traceloop.com/slack) and ask us any questions there.

## 💚 Community & Support

- [Slack](https://traceloop.com/slack) (For live discussion with the community and the Traceloop team)
- [GitHub Discussions](https://github.com/traceloop/hub/discussions) (For help with building and deeper conversations about features)
- [GitHub Issues](https://github.com/traceloop/hub/issues) (For any bugs and errors you encounter using Hub)
- [Twitter](https://twitter.com/traceloopdev) (Get news fast)