Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/traceloop/openllmetry-js
Sister project to OpenLLMetry, but in TypeScript. Open-source observability for your LLM application, based on OpenTelemetry.
- Host: GitHub
- URL: https://github.com/traceloop/openllmetry-js
- Owner: traceloop
- License: apache-2.0
- Created: 2023-09-27T22:38:28.000Z (over 1 year ago)
- Default Branch: main
- Last Pushed: 2025-01-13T13:13:49.000Z (12 days ago)
- Last Synced: 2025-01-15T23:38:11.762Z (10 days ago)
- Topics: datascience, generative-ai, javascript, llmops, metrics, ml, model-monitoring, monitoring, nextjs, observability, open-source, opentelemetry, opentelemetry-javascript, typescript
- Language: TypeScript
- Homepage: https://www.traceloop.com/openllmetry
- Size: 6.07 MB
- Stars: 282
- Watchers: 1
- Forks: 29
- Open Issues: 49
Metadata Files:
- Readme: README.md
- Changelog: CHANGELOG.md
- Contributing: CONTRIBUTING.md
- License: LICENSE
- Code of conduct: CODE_OF_CONDUCT.md
Awesome Lists containing this project
- awesome-observability - OpenLLMetry for JavaScript - Sister project to OpenLLMetry, but in TypeScript. Open-source observability for your LLM application, based on OpenTelemetry. (3. Collect / Metrics)
README
For JavaScript / TypeScript
Open-source observability for your LLM application
Get started with Node.js or Next.js »
Slack | Docs | Website
**New**:
Our semantic conventions are now part of OpenTelemetry! Join the [discussion](https://github.com/open-telemetry/community/blob/1c71595874e5d125ca92ec3b0e948c4325161c8a/projects/llm-semconv.md) and help us shape the future of LLM observability.

OpenLLMetry-JS is a set of extensions built on top of [OpenTelemetry](https://opentelemetry.io/) that gives you complete observability over your LLM application. Because it uses OpenTelemetry under the hood, it can be connected to your existing observability solutions - Datadog, Honeycomb, and others.
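Semantic conventions standardize the attribute names that LLM spans carry, so any backend can interpret them. As a rough illustration (the exact attribute set is defined by the OpenTelemetry GenAI conventions, and the values below are made-up examples), an instrumented chat call might produce span attributes like:

```js
// Illustrative span attributes following the OpenTelemetry GenAI
// semantic conventions (values are hypothetical examples)
const spanAttributes = {
  "gen_ai.system": "openai", // which LLM provider was called
  "gen_ai.request.model": "gpt-4", // model requested by the application
  "gen_ai.usage.input_tokens": 120, // prompt tokens consumed
  "gen_ai.usage.output_tokens": 42, // completion tokens produced
};

console.log(spanAttributes["gen_ai.system"]);
```

Because these names are shared across instrumentations, dashboards and alerts built on them work regardless of which provider your application calls.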
It's built and maintained by Traceloop under the Apache 2.0 license.
The repo contains standard OpenTelemetry instrumentations for LLM providers and Vector DBs, as well as a Traceloop SDK that makes it easy to get started with OpenLLMetry-JS, while still outputting standard OpenTelemetry data that can be connected to your observability stack.
If you already have OpenTelemetry instrumented, you can just add any of our instrumentations directly.

## Getting Started
The easiest way to get started is to use our SDK.
For a complete guide, go to our [docs](https://traceloop.com/docs/openllmetry/getting-started-ts).

Install the SDK:
```shell
npm install --save @traceloop/node-server-sdk
```

Then, to start instrumenting your code, just add these two lines:
```js
import * as traceloop from "@traceloop/node-server-sdk";
traceloop.initialize();
```

Make sure to `import` the SDK before importing any LLM module.
That's it. You're now tracing your code with OpenLLMetry-JS!
If you're running this locally, you may want to disable batch sending, so you can see the traces immediately:

```js
traceloop.initialize({ disableBatch: true });
```

Now, you need to decide where to export the traces to.
## Supported (and tested) destinations
- ✅ [Traceloop](https://www.traceloop.com/docs/openllmetry/integrations/traceloop)
- ✅ [Dynatrace](https://www.traceloop.com/docs/openllmetry/integrations/dynatrace)
- ✅ [Datadog](https://www.traceloop.com/docs/openllmetry/integrations/datadog)
- ✅ [New Relic](https://www.traceloop.com/docs/openllmetry/integrations/newrelic)
- ✅ [Honeycomb](https://www.traceloop.com/docs/openllmetry/integrations/honeycomb)
- ✅ [Grafana Tempo](https://www.traceloop.com/docs/openllmetry/integrations/grafana)
- ✅ [HyperDX](https://www.traceloop.com/docs/openllmetry/integrations/hyperdx)
- ✅ [SigNoz](https://www.traceloop.com/docs/openllmetry/integrations/signoz)
- ✅ [Splunk](https://www.traceloop.com/docs/openllmetry/integrations/splunk)
- ✅ [OpenTelemetry Collector](https://www.traceloop.com/docs/openllmetry/integrations/otel-collector)

See [our docs](https://traceloop.com/docs/openllmetry/integrations/exporting) for instructions on connecting to each one.
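As a concrete sketch, exporting to a locally running OpenTelemetry Collector can be done by pointing the SDK's base URL at it before starting your app (this assumes the `TRACELOOP_BASE_URL` environment variable described in the docs, and the Collector's default OTLP/HTTP port 4318):

```shell
# Send traces to a local OpenTelemetry Collector instead of the default backend
# (assumes the Collector listens on the standard OTLP/HTTP port)
export TRACELOOP_BASE_URL=http://localhost:4318
```

Each destination's docs page lists the exact endpoint and authentication variables it expects.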
## What do we instrument?
OpenLLMetry-JS can instrument everything that [OpenTelemetry already instruments](https://github.com/open-telemetry/opentelemetry-js-contrib/tree/main/plugins/node) - so things like your DB, API calls, and more. On top of that, we built a set of custom extensions that instrument things like your calls to OpenAI or Anthropic, or your Vector DB like Pinecone, Chroma, or Weaviate.
### LLM Providers
- ✅ OpenAI
- ✅ Azure OpenAI
- ✅ Anthropic
- ✅ Cohere
- ⏳ Replicate
- ⏳ HuggingFace
- ✅ Vertex AI (GCP)
- ✅ Bedrock (AWS)

### Vector DBs
- ✅ Pinecone
- ✅ Chroma
- ✅ Qdrant
- ⏳ Weaviate
- ⏳ Milvus

### Frameworks
- ✅ LangChain
- ✅ LlamaIndex

## Telemetry
The SDK provided with OpenLLMetry (not the instrumentations) contains a telemetry feature that collects **anonymous** usage information.
You can opt out of telemetry by setting the `TRACELOOP_TELEMETRY` environment variable to `FALSE`.
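For example, set the variable in your shell or deployment manifest before the SDK initializes:

```shell
# Disable anonymous SDK telemetry for this process and its children
export TRACELOOP_TELEMETRY=FALSE
```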
### Why we collect telemetry
- The primary purpose is to detect exceptions within instrumentations. Since LLM providers frequently update their APIs, this helps us quickly identify and fix any breaking changes.
- We only collect anonymous data, with no personally identifiable information. You can view exactly what data we collect in our [Privacy documentation](https://www.traceloop.com/docs/openllmetry/privacy/telemetry).
- Telemetry is only collected in the SDK. If you use the instrumentations directly without the SDK, no telemetry is collected.

## Contributing
Whether it's big or small, we love contributions ❤️ Check out our guide to see how to [get started](https://traceloop.com/docs/openllmetry/contributing/overview).
Not sure where to get started? You can:
- [Book a free pairing session with one of our teammates](mailto:[email protected]?subject=Pairing%20session&body=I'd%20like%20to%20do%20a%20pairing%20session!)!
- Join our Slack, and ask us any questions there.

## Community & Support
- [Slack](https://traceloop.com/slack) (For live discussion with the community and the Traceloop team)
- [GitHub Discussions](https://github.com/traceloop/openllmetry-js/discussions) (For help with building and deeper conversations about features)
- [GitHub Issues](https://github.com/traceloop/openllmetry-js/issues) (For any bugs and errors you encounter using OpenLLMetry)
- [Twitter](https://twitter.com/traceloopdev) (Get news fast)