Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/traceloop/go-openllmetry
Sister project to OpenLLMetry, but in Go. Open-source observability for your LLM application, based on OpenTelemetry
- Host: GitHub
- URL: https://github.com/traceloop/go-openllmetry
- Owner: traceloop
- License: apache-2.0
- Created: 2024-01-11T14:53:13.000Z (10 months ago)
- Default Branch: main
- Last Pushed: 2024-04-05T18:11:15.000Z (7 months ago)
- Last Synced: 2024-09-27T07:01:42.745Z (about 2 months ago)
- Topics: datascience, generative-ai, golang, llm, llmops, metrics, ml, monitoring, observability, open-source, open-telemetry, opentelemetry
- Language: Go
- Homepage: https://www.traceloop.com/openllmetry
- Size: 47.9 KB
- Stars: 7
- Watchers: 2
- Forks: 1
- Open Issues: 2
- Metadata Files:
  - Readme: README.md
  - Contributing: CONTRIBUTING.md
  - License: LICENSE
  - Code of conduct: CODE_OF_CONDUCT.md
Awesome Lists containing this project
- awesome-observability - OpenLLMetry for Go - Sister project to OpenLLMetry, but in Go. Open-source observability for your LLM application, based on OpenTelemetry. (3. Collect / Metrics)
README
For Go
Open-source observability for your LLM application
Get started »
Slack | Docs | Website
OpenLLMetry is a set of extensions built on top of [OpenTelemetry](https://opentelemetry.io/) that gives you complete observability over your LLM application. Because it uses OpenTelemetry under the hood, it can be connected to your existing observability solutions, such as Datadog, Honeycomb, and others.
It's built and maintained by Traceloop under the Apache 2.0 license.
The repo contains standard OpenTelemetry instrumentations for LLM providers and Vector DBs, as well as a Traceloop SDK that makes it easy to get started with OpenLLMetry, while still outputting standard OpenTelemetry data that can be connected to your observability stack.
If you already have OpenTelemetry instrumented, you can just add any of our instrumentations directly.

## Getting Started
The easiest way to get started is to use our SDK.
For a complete guide, go to our [docs](https://traceloop.com/docs/openllmetry/getting-started-go).

Install the SDK:
```bash
go get github.com/traceloop/go-openllmetry/traceloop-sdk
```

Then, initialize the SDK in your code:
```go
package main

import (
	"context"
	"os"

	sdk "github.com/traceloop/go-openllmetry/traceloop-sdk"
)

func main() {
	ctx := context.Background()

	traceloop := sdk.NewClient(ctx, sdk.Config{
		APIKey: os.Getenv("TRACELOOP_API_KEY"),
	})
	defer func() { traceloop.Shutdown(ctx) }()
}
```

That's it. You're now tracing your code with OpenLLMetry!
Now, you need to decide where to export the traces to.
## Supported (and tested) destinations
- [x] [Traceloop](https://www.traceloop.com/docs/openllmetry/integrations/traceloop)
- [x] [Dynatrace](https://www.traceloop.com/docs/openllmetry/integrations/dynatrace)
- [x] [Datadog](https://www.traceloop.com/docs/openllmetry/integrations/datadog)
- [x] [New Relic](https://www.traceloop.com/docs/openllmetry/integrations/newrelic)
- [x] [Honeycomb](https://www.traceloop.com/docs/openllmetry/integrations/honeycomb)
- [x] [Grafana Tempo](https://www.traceloop.com/docs/openllmetry/integrations/grafana)
- [x] [HyperDX](https://www.traceloop.com/docs/openllmetry/integrations/hyperdx)
- [x] [SigNoz](https://www.traceloop.com/docs/openllmetry/integrations/signoz)
- [x] [OpenTelemetry Collector](https://www.traceloop.com/docs/openllmetry/integrations/otel-collector)

See [our docs](https://traceloop.com/docs/openllmetry/integrations/exporting) for instructions on connecting to each one.
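Since the SDK emits standard OpenTelemetry data, exporting to a self-hosted OpenTelemetry Collector is mostly a configuration change. Here is a minimal sketch, assuming the Go SDK honors the `TRACELOOP_BASE_URL` environment variable documented for the other Traceloop SDKs; the variable name and the collector endpoint below are assumptions, not something this README specifies:

```go
package main

import (
	"context"
	"os"

	sdk "github.com/traceloop/go-openllmetry/traceloop-sdk"
)

func main() {
	ctx := context.Background()

	// Assumption: point the exporter at a local OpenTelemetry Collector's
	// OTLP endpoint instead of the Traceloop backend. The variable name is
	// taken from the Traceloop docs and may differ for the Go SDK.
	os.Setenv("TRACELOOP_BASE_URL", "http://localhost:4318")

	traceloop := sdk.NewClient(ctx, sdk.Config{
		APIKey: os.Getenv("TRACELOOP_API_KEY"),
	})
	defer func() { traceloop.Shutdown(ctx) }()
}
```

Depending on the destination, the API key may not be needed; the integration guides linked above cover the specifics for each backend.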
## What do we instrument?
OpenLLMetry is in an early-alpha, exploratory stage, and we're still figuring out what to instrument.
Unlike other languages, Go doesn't have many official LLM client libraries (yet?), so for now you'll have to log prompts manually:

```go
package main

import (
	"context"
	"fmt"
	"os"

	"github.com/sashabaranov/go-openai"
	sdk "github.com/traceloop/go-openllmetry/traceloop-sdk"
)

func main() {
	ctx := context.Background()

	// Initialize Traceloop
	traceloop := sdk.NewClient(ctx, sdk.Config{
		APIKey: os.Getenv("TRACELOOP_API_KEY"),
	})
	defer func() { traceloop.Shutdown(ctx) }()

	// Build the OpenAI request like you normally would
	request := openai.ChatCompletionRequest{
		Model: openai.GPT3Dot5Turbo,
		Messages: []openai.ChatCompletionMessage{
			{
				Role:    openai.ChatMessageRoleUser,
				Content: "Tell me a joke about OpenTelemetry!",
			},
		},
	}

	var promptMsgs []sdk.Message
	for i, message := range request.Messages {
		promptMsgs = append(promptMsgs, sdk.Message{
			Index:   i,
			Content: message.Content,
			Role:    message.Role,
		})
	}

	// Log the request
	llmSpan, err := traceloop.LogPrompt(
		ctx,
		sdk.Prompt{
			Vendor:   "openai",
			Mode:     "chat",
			Model:    request.Model,
			Messages: promptMsgs,
		},
		sdk.TraceloopAttributes{
			WorkflowName: "example-workflow",
			EntityName:   "example-entity",
		},
	)
	if err != nil {
		fmt.Printf("LogPrompt error: %v\n", err)
		return
	}

	// Call OpenAI like you normally would
	client := openai.NewClient(os.Getenv("OPENAI_API_KEY"))
	resp, err := client.CreateChatCompletion(ctx, request)
	if err != nil {
		fmt.Printf("ChatCompletion error: %v\n", err)
		return
	}

	var completionMsgs []sdk.Message
	for _, choice := range resp.Choices {
		completionMsgs = append(completionMsgs, sdk.Message{
			Index:   choice.Index,
			Content: choice.Message.Content,
			Role:    choice.Message.Role,
		})
	}

	// Log the response
	llmSpan.LogCompletion(ctx, sdk.Completion{
		Model:    resp.Model,
		Messages: completionMsgs,
	}, sdk.Usage{
		TotalTokens:      resp.Usage.TotalTokens,
		CompletionTokens: resp.Usage.CompletionTokens,
		PromptTokens:     resp.Usage.PromptTokens,
	})
}
```
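The two conversion loops in the example are easy to factor out. Below is a small sketch of hypothetical helpers (ours, not part of the SDK) that turn go-openai messages and response choices into the `sdk.Message` slices expected by `LogPrompt` and `LogCompletion`; they are meant to live alongside the example's `main` function:

```go
package main

import (
	"github.com/sashabaranov/go-openai"

	sdk "github.com/traceloop/go-openllmetry/traceloop-sdk"
)

// toPromptMessages converts the messages of an outgoing chat request into
// the sdk.Message slice passed to LogPrompt. (Hypothetical helper.)
func toPromptMessages(msgs []openai.ChatCompletionMessage) []sdk.Message {
	out := make([]sdk.Message, 0, len(msgs))
	for i, m := range msgs {
		out = append(out, sdk.Message{
			Index:   i,
			Content: m.Content,
			Role:    m.Role,
		})
	}
	return out
}

// toCompletionMessages converts the choices of a chat response into the
// sdk.Message slice passed to LogCompletion. (Hypothetical helper.)
func toCompletionMessages(choices []openai.ChatCompletionChoice) []sdk.Message {
	out := make([]sdk.Message, 0, len(choices))
	for _, c := range choices {
		out = append(out, sdk.Message{
			Index:   c.Index,
			Content: c.Message.Content,
			Role:    c.Message.Role,
		})
	}
	return out
}
```

With these in place, the logging calls shrink to `Messages: toPromptMessages(request.Messages)` and `Messages: toCompletionMessages(resp.Choices)`.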
## Contributing

Whether it's big or small, we love contributions ❤️ Check out our guide to see how to [get started](https://traceloop.com/docs/openllmetry/contributing/overview).
Not sure where to get started? You can:
- [Book a free pairing session with one of our teammates](mailto:[email protected]?subject=Pairing%20session&body=I'd%20like%20to%20do%20a%20pairing%20session!)!
- Join our Slack, and ask us any questions there.

## Community & Support
- [Slack](https://join.slack.com/t/traceloopcommunity/shared_invite/zt-1plpfpm6r-zOHKI028VkpcWdobX65C~g) (For live discussion with the community and the Traceloop team)
- [GitHub Discussions](https://github.com/traceloop/go-openllmetry/discussions) (For help with building and deeper conversations about features)
- [GitHub Issues](https://github.com/traceloop/go-openllmetry/issues) (For any bugs and errors you encounter using OpenLLMetry)
- [Twitter](https://twitter.com/traceloopdev) (Get news fast)

## Special Thanks
To @patrickdebois, who [suggested the great name](https://x.com/patrickdebois/status/1695518950715473991?s=46&t=zn2SOuJcSVq-Pe2Ysevzkg) we're now using for this repo!