Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/traceloop/openllmetry
Open-source observability for your LLM application, based on OpenTelemetry
- Host: GitHub
- URL: https://github.com/traceloop/openllmetry
- Owner: traceloop
- License: apache-2.0
- Created: 2023-09-02T14:42:59.000Z (over 1 year ago)
- Default Branch: main
- Last Pushed: 2024-11-14T07:25:46.000Z (2 months ago)
- Last Synced: 2024-11-15T09:47:05.863Z (2 months ago)
- Topics: artifical-intelligence, datascience, generative-ai, good-first-issue, good-first-issues, help-wanted, llm, llmops, metrics, ml, model-monitoring, monitoring, observability, open-source, open-telemetry, opentelemetry, opentelemetry-python, python
- Language: Python
- Homepage: https://www.traceloop.com/openllmetry
- Size: 25.5 MB
- Stars: 3,447
- Watchers: 8
- Forks: 678
- Open Issues: 115
Metadata Files:
- Readme: README.md
- Changelog: CHANGELOG.md
- Contributing: CONTRIBUTING.md
- License: LICENSE
- Code of conduct: CODE_OF_CONDUCT.md
- Security: SECURITY.md
- Governance: GOVERNANCE.md
Awesome Lists containing this project
- awesome-observability - OpenLLMetry - Open-source observability for your LLM application, based on OpenTelemetry. (3. Collect / Metrics)
- awesome-langchain-zh - Openllmetry
- awesome-langchain - Openllmetry - Open-source observability for your LLM application, based on OpenTelemetry ![GitHub Repo stars](https://img.shields.io/github/stars/traceloop/openllmetry?style=social) (Tools / Platforms)
- StarryDivineSky - traceloop/openllmetry
- Awesome-LLM-RAG-Application - openllmetry
- awesome-repositories - traceloop/openllmetry - Open-source observability for your LLM application, based on OpenTelemetry (Python)
- Awesome-LLMOps - OpenLLMetry - Open-source observability for your LLM application, based on OpenTelemetry (Observation)
README
Open-source observability for your LLM application
Get started »
Slack | Docs | Website
**New**: Our semantic conventions are now part of OpenTelemetry! Join the [discussion](https://github.com/open-telemetry/community/blob/1c71595874e5d125ca92ec3b0e948c4325161c8a/projects/llm-semconv.md) and help us shape the future of LLM observability.

Looking for the JS/TS version? Check out [OpenLLMetry-JS](https://github.com/traceloop/openllmetry-js).
OpenLLMetry is a set of extensions built on top of [OpenTelemetry](https://opentelemetry.io/) that gives you complete observability over your LLM application. Because it uses OpenTelemetry under the hood, [it can be connected to your existing observability solutions](https://www.traceloop.com/docs/openllmetry/integrations/introduction) - Datadog, Honeycomb, and others.
It's built and maintained by Traceloop under the Apache 2.0 license.
The repo contains standard OpenTelemetry instrumentations for LLM providers and Vector DBs, as well as a Traceloop SDK that makes it easy to get started with OpenLLMetry, while still outputting standard OpenTelemetry data that can be connected to your observability stack.
If you already have OpenTelemetry instrumented, you can just add any of our instrumentations directly.
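For example, a single OpenLLMetry instrumentation can be enabled on top of an existing OpenTelemetry tracer provider, without the Traceloop SDK. This is a minimal sketch: it assumes the `opentelemetry-instrumentation-openai` package from this repo together with the standard OTLP HTTP exporter, and the collector endpoint is a placeholder.

```python
# Minimal sketch: use one OpenLLMetry instrumentation with an existing
# OpenTelemetry setup, without the Traceloop SDK. Endpoint is a placeholder.
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

from opentelemetry.instrumentation.openai import OpenAIInstrumentor

# Standard OpenTelemetry pipeline: tracer provider -> batch processor -> OTLP exporter.
provider = TracerProvider()
provider.add_span_processor(
    BatchSpanProcessor(OTLPSpanExporter(endpoint="http://localhost:4318/v1/traces"))
)
trace.set_tracer_provider(provider)

# Patch the OpenAI client so LLM calls emit standard OpenTelemetry spans.
OpenAIInstrumentor().instrument()
```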
## Getting Started

The easiest way to get started is to use our SDK.
For a complete guide, go to our [docs](https://traceloop.com/docs/openllmetry/getting-started-python).

Install the SDK:
```bash
pip install traceloop-sdk
```

Then, to start instrumenting your code, just add these lines to your code:
```python
from traceloop.sdk import Traceloop

Traceloop.init()
```

That's it. You're now tracing your code with OpenLLMetry!
If you're running this locally, you may want to disable batch sending, so you can see the traces immediately:

```python
Traceloop.init(disable_batch=True)
```

## ⏫ Supported (and tested) destinations
- ✅ [Traceloop](https://www.traceloop.com/docs/openllmetry/integrations/traceloop)
- ✅ [Axiom](https://www.traceloop.com/docs/openllmetry/integrations/axiom)
- ✅ [Azure Application Insights](https://www.traceloop.com/docs/openllmetry/integrations/azure)
- ✅ [Braintrust](https://www.traceloop.com/docs/openllmetry/integrations/braintrust)
- ✅ [Dash0](https://www.traceloop.com/docs/openllmetry/integrations/dash0)
- ✅ [Datadog](https://www.traceloop.com/docs/openllmetry/integrations/datadog)
- ✅ [Dynatrace](https://www.traceloop.com/docs/openllmetry/integrations/dynatrace)
- ✅ [Google Cloud](https://www.traceloop.com/docs/openllmetry/integrations/gcp)
- ✅ [Grafana](https://www.traceloop.com/docs/openllmetry/integrations/grafana)
- ✅ [Highlight](https://www.traceloop.com/docs/openllmetry/integrations/highlight)
- ✅ [Honeycomb](https://www.traceloop.com/docs/openllmetry/integrations/honeycomb)
- ✅ [HyperDX](https://www.traceloop.com/docs/openllmetry/integrations/hyperdx)
- ✅ [IBM Instana](https://www.traceloop.com/docs/openllmetry/integrations/instana)
- ✅ [KloudMate](https://www.traceloop.com/docs/openllmetry/integrations/kloudmate)
- ✅ [New Relic](https://www.traceloop.com/docs/openllmetry/integrations/newrelic)
- ✅ [OpenTelemetry Collector](https://www.traceloop.com/docs/openllmetry/integrations/otel-collector)
- ✅ [Service Now Cloud Observability](https://www.traceloop.com/docs/openllmetry/integrations/service-now)
- ✅ [SigNoz](https://www.traceloop.com/docs/openllmetry/integrations/signoz)
- ✅ [Sentry](https://www.traceloop.com/docs/openllmetry/integrations/sentry)
- ✅ [Splunk](https://www.traceloop.com/docs/openllmetry/integrations/splunk)

See [our docs](https://traceloop.com/docs/openllmetry/integrations/exporting) for instructions on connecting to each one.
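For illustration, pointing the SDK at one of these backends is typically a matter of setting its OTLP endpoint and auth headers. The `TRACELOOP_BASE_URL` and `TRACELOOP_HEADERS` variable names and the Honeycomb-style values below are assumptions based on the exporting docs; check the integration page for your destination for the exact settings.

```python
import os

# Assumed variable names from the exporting docs; endpoint and header values
# are placeholders for a Honeycomb-style OTLP backend, not verified here.
os.environ["TRACELOOP_BASE_URL"] = "https://api.honeycomb.io"
os.environ["TRACELOOP_HEADERS"] = "x-honeycomb-team=<YOUR_API_KEY>"

from traceloop.sdk import Traceloop

# The SDK exports standard OpenTelemetry data to the configured endpoint.
Traceloop.init(app_name="my-llm-service")
```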
## What do we instrument?
OpenLLMetry can instrument everything that [OpenTelemetry already instruments](https://github.com/open-telemetry/opentelemetry-python-contrib/tree/main/instrumentation) - so things like your DB, API calls, and more. On top of that, we built a set of custom extensions that instrument things like your calls to OpenAI or Anthropic, or your Vector DB like Chroma, Pinecone, Qdrant or Weaviate.
### LLM Providers

- ✅ [OpenAI / Azure OpenAI](https://openai.com/)
- ✅ [Anthropic](https://www.anthropic.com/)
- ✅ [Cohere](https://cohere.com/)
- ✅ [Ollama](https://ollama.com/)
- ✅ [Mistral AI](https://mistral.ai/)
- ✅ [HuggingFace](https://huggingface.co/)
- ✅ [Bedrock (AWS)](https://aws.amazon.com/bedrock/)
- ✅ [SageMaker (AWS)](https://aws.amazon.com/sagemaker/)
- ✅ [Replicate](https://replicate.com/)
- ✅ [Vertex AI (GCP)](https://cloud.google.com/vertex-ai)
- ✅ [Google Generative AI (Gemini)](https://ai.google/)
- ✅ [IBM Watsonx AI](https://www.ibm.com/watsonx)
- ✅ [Together AI](https://together.xyz/)
- ✅ [Aleph Alpha](https://www.aleph-alpha.com/)
- ✅ [Groq](https://groq.com/)

### Vector DBs
- ✅ [Chroma](https://www.trychroma.com/)
- ✅ [Pinecone](https://www.pinecone.io/)
- ✅ [Qdrant](https://qdrant.tech/)
- ✅ [Weaviate](https://weaviate.io/)
- ✅ [Milvus](https://milvus.io/)
- ✅ [Marqo](https://marqo.ai/)
- ✅ [LanceDB](https://lancedb.com/)

### Frameworks
- ✅ [LangChain](https://python.langchain.com/docs/introduction/)
- ✅ [LlamaIndex](https://docs.llamaindex.ai/en/stable/module_guides/observability/observability.html#openllmetry)
- ✅ [Haystack](https://haystack.deepset.ai/integrations/traceloop)
- ✅ [LiteLLM](https://docs.litellm.ai/docs/observability/opentelemetry_integration)

## Telemetry
The SDK provided with OpenLLMetry (not the instrumentations) contains a telemetry feature that collects **anonymous** usage information.
You can opt out of telemetry by setting the `TRACELOOP_TELEMETRY` environment variable to `FALSE`, or passing `telemetry_enabled=False` to the `Traceloop.init()` function.
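As a small illustration of both opt-out paths described above (using only the `TRACELOOP_TELEMETRY` variable and the `telemetry_enabled` flag named in this section):

```python
import os

# Either set the environment variable before the SDK initializes...
os.environ["TRACELOOP_TELEMETRY"] = "FALSE"

from traceloop.sdk import Traceloop

# ...or disable telemetry explicitly when calling init().
Traceloop.init(telemetry_enabled=False)
```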
### Why we collect telemetry
- The primary purpose is to detect exceptions within instrumentations. Since LLM providers frequently update their APIs, this helps us quickly identify and fix any breaking changes.
- We only collect anonymous data, with no personally identifiable information. You can view exactly what data we collect in our [Privacy documentation](https://www.traceloop.com/docs/openllmetry/privacy/telemetry).
- Telemetry is only collected in the SDK. If you use the instrumentations directly without the SDK, no telemetry is collected.

## Contributing
Whether big or small, we love contributions ❤️ Check out our guide to see how to [get started](https://traceloop.com/docs/openllmetry/contributing/overview).
Not sure where to get started? You can:
- [Book a free pairing session with one of our teammates](mailto:[email protected]?subject=Pairing%20session&body=I'd%20like%20to%20do%20a%20pairing%20session!)!
- Join our Slack, and ask us any questions there.

## Community & Support
- [Slack](https://traceloop.com/slack) (For live discussion with the community and the Traceloop team)
- [GitHub Discussions](https://github.com/traceloop/openllmetry/discussions) (For help with building and deeper conversations about features)
- [GitHub Issues](https://github.com/traceloop/openllmetry/issues) (For any bugs and errors you encounter using OpenLLMetry)
- [Twitter](https://twitter.com/traceloopdev) (Get news fast)

## Special Thanks
To @patrickdebois, who [suggested the great name](https://x.com/patrickdebois/status/1695518950715473991?s=46&t=zn2SOuJcSVq-Pe2Ysevzkg) we're now using for this repo!
## Contributors