# OpenLLMetry-JS

For JavaScript / TypeScript

**Open-source observability for your LLM application**

Get started with Node.js or Next.js »

Slack | Docs | Website







**🎉 New**: Our semantic conventions are now part of OpenTelemetry! Join the [discussion](https://github.com/open-telemetry/community/blob/1c71595874e5d125ca92ec3b0e948c4325161c8a/projects/llm-semconv.md) and help us shape the future of LLM observability.

OpenLLMetry-JS is a set of extensions built on top of [OpenTelemetry](https://opentelemetry.io/) that gives you complete observability over your LLM application. Because it uses OpenTelemetry under the hood, it can be connected to your existing observability solutions, such as Datadog, Honeycomb, and others.

It's built and maintained by Traceloop under the Apache 2.0 license.

The repo contains standard OpenTelemetry instrumentations for LLM providers and vector DBs, as well as a Traceloop SDK that makes it easy to get started with OpenLLMetry-JS while still emitting standard OpenTelemetry data that can be connected to your observability stack.
If your code is already instrumented with OpenTelemetry, you can add any of our instrumentations directly; see the sketch below.
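
For example, if you already run the OpenTelemetry Node SDK, registering one of the instrumentations from this repo might look like the following sketch. The `@traceloop/instrumentation-openai` package and `OpenAIInstrumentation` export are assumptions here; check the package you install for its exact names.

```js
// Sketch: adding an OpenLLMetry instrumentation to an existing OpenTelemetry setup.
// Package and class names are assumptions; verify them against the installed package.
import { NodeSDK } from "@opentelemetry/sdk-node";
import { OpenAIInstrumentation } from "@traceloop/instrumentation-openai";

const sdk = new NodeSDK({
  // ...your existing exporter / resource configuration...
  instrumentations: [new OpenAIInstrumentation()],
});

sdk.start();
```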

## 🚀 Getting Started

The easiest way to get started is to use our SDK.
For a complete guide, go to our [docs](https://traceloop.com/docs/openllmetry/getting-started-ts).

Install the SDK:

```shell
npm install --save @traceloop/node-server-sdk
```

Then, to start instrumenting your code, just add these two lines:

```js
import * as traceloop from "@traceloop/node-server-sdk";

traceloop.initialize();
```

Make sure to `import` the SDK before importing any LLM module.

That's it. You're now tracing your code with OpenLLMetry-JS!
If you're running this locally, you may want to disable batch sending, so you can see the traces immediately:

```js
traceloop.initialize({ disableBatch: true });
```
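
Putting the pieces together, a locally run, instrumented script might look like the sketch below. The OpenAI client and model name are only illustrative, and the dynamic `import()` simply makes the load order explicit (SDK first, LLM module second):

```js
import * as traceloop from "@traceloop/node-server-sdk";

traceloop.initialize({ disableBatch: true });

// Load the LLM client only after the SDK has been imported and initialized.
const { default: OpenAI } = await import("openai");

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment
const chat = await openai.chat.completions.create({
  model: "gpt-4o-mini", // illustrative model name
  messages: [{ role: "user", content: "Say hello" }],
});
console.log(chat.choices[0].message.content);
```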

Now you need to decide where to export the traces.

## โซ Supported (and tested) destinations

- ✅ [Traceloop](https://www.traceloop.com/docs/openllmetry/integrations/traceloop)
- ✅ [Dynatrace](https://www.traceloop.com/docs/openllmetry/integrations/dynatrace)
- ✅ [Datadog](https://www.traceloop.com/docs/openllmetry/integrations/datadog)
- ✅ [New Relic](https://www.traceloop.com/docs/openllmetry/integrations/newrelic)
- ✅ [Honeycomb](https://www.traceloop.com/docs/openllmetry/integrations/honeycomb)
- ✅ [Grafana Tempo](https://www.traceloop.com/docs/openllmetry/integrations/grafana)
- ✅ [HyperDX](https://www.traceloop.com/docs/openllmetry/integrations/hyperdx)
- ✅ [SigNoz](https://www.traceloop.com/docs/openllmetry/integrations/signoz)
- ✅ [Splunk](https://www.traceloop.com/docs/openllmetry/integrations/splunk)
- ✅ [OpenTelemetry Collector](https://www.traceloop.com/docs/openllmetry/integrations/otel-collector)

See [our docs](https://traceloop.com/docs/openllmetry/integrations/exporting) for instructions on connecting to each one.
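
As one hedged example: if the SDK accepts a custom OpenTelemetry span exporter (an assumption here; the integration docs above describe the supported configuration for each destination), pointing traces at a generic OTLP/HTTP endpoint could look like this:

```js
import * as traceloop from "@traceloop/node-server-sdk";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-http";

// Sketch only: the `exporter` option is an assumption; follow the linked docs
// for the exact way to configure your destination.
traceloop.initialize({
  exporter: new OTLPTraceExporter({
    url: "https://collector.example.com/v1/traces",
    headers: { "x-api-key": "<your-key>" },
  }),
});
```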

## 🪗 What do we instrument?

OpenLLMetry-JS can instrument everything that [OpenTelemetry already instruments](https://github.com/open-telemetry/opentelemetry-js-contrib/tree/main/plugins/node), such as your DB, API calls, and more. On top of that, we built a set of custom extensions that instrument things like your calls to OpenAI or Anthropic, or your vector DB like Pinecone, Chroma, or Weaviate.

### LLM Providers

- ✅ OpenAI
- ✅ Azure OpenAI
- ✅ Anthropic
- ✅ Cohere
- ⏳ Replicate
- ⏳ HuggingFace
- ✅ Vertex AI (GCP)
- ✅ Bedrock (AWS)

### Vector DBs

- ✅ Pinecone
- ✅ Chroma
- ⏳ Weaviate
- ⏳ Milvus

### Frameworks

- ✅ LangChain
- ✅ LlamaIndex

## 🌱 Contributing

Whether it's big or small, we love contributions ❤️ Check out our guide to see how to [get started](https://traceloop.com/docs/openllmetry/contributing/overview).

Not sure where to get started? You can:

- [Book a free pairing session with one of our teammates](mailto:[email protected]?subject=Pairing%20session&body=I'd%20like%20to%20do%20a%20pairing%20session!)!
- Join our Slack, and ask us any questions there.

## 💚 Community & Support

- [Slack](https://traceloop.com/slack) (For live discussion with the community and the Traceloop team)
- [GitHub Discussions](https://github.com/traceloop/openllmetry-js/discussions) (For help with building and deeper conversations about features)
- [GitHub Issues](https://github.com/traceloop/openllmetry-js/issues) (For any bugs and errors you encounter using OpenLLMetry)
- [Twitter](https://twitter.com/traceloopdev) (Get news fast)