Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/lmnr-ai/lmnr
Laminar - Open-source DataDog + PostHog for AI agents / RAG apps. Fast, reliable and insightful. Written in Rust 🦀. YC S24.
agents ai aiops analytics blazingly-fast developer-tools llm-evaluation llm-observability llm-workflow monitoring observability open-source pipeline-builder rag rust-lang self-hosted
Last synced: about 1 month ago
- Host: GitHub
- URL: https://github.com/lmnr-ai/lmnr
- Owner: lmnr-ai
- License: apache-2.0
- Created: 2024-08-29T03:45:28.000Z (3 months ago)
- Default Branch: main
- Last Pushed: 2024-09-26T19:01:46.000Z (about 2 months ago)
- Last Synced: 2024-09-29T21:02:33.531Z (about 2 months ago)
- Topics: agents, ai, aiops, analytics, blazingly-fast, developer-tools, llm-evaluation, llm-observability, llm-workflow, monitoring, observability, open-source, pipeline-builder, rag, rust-lang, self-hosted
- Language: TypeScript
- Homepage: https://www.lmnr.ai
- Size: 6.73 MB
- Stars: 780
- Watchers: 5
- Forks: 26
- Open Issues: 2
Metadata Files:
- Readme: README.md
- License: LICENSE.md
Awesome Lists containing this project
- awesome-llmops - Laminar - Open-source all-in-one platform for engineering AI products. Traces, Evals, Datasets, Labels. | ![GitHub Badge](https://img.shields.io/github/stars/lmnr-ai/lmnr.svg?style=flat-square) | (LLMOps / Observability)
README
![Static Badge](https://img.shields.io/badge/Y%20Combinator-S24-orange)
![X (formerly Twitter) Follow](https://img.shields.io/twitter/follow/lmnrai)
![Static Badge](https://img.shields.io/badge/Join_Discord-464646?&logo=discord&logoColor=5865F2)

# Laminar - LLM engineering from first principles
Laminar is an open-source platform for engineering LLM products. Trace, evaluate, annotate, and analyze LLM data. Bring LLM applications to production with confidence.
Think of it as DataDog + PostHog for LLM apps.
- OpenTelemetry-based instrumentation: automatic for LLM / vector DB calls with just 2 lines of code + decorators to track functions (powered by the excellent [OpenLLMetry](https://github.com/traceloop/openllmetry) open-source package by Traceloop).
- Semantic events-based analytics. Laminar hosts background job queues of LLM pipelines. Outputs of those pipelines are turned into metrics. For example, you can design a pipeline which extracts "my AI drive-through agent made an upsell" data, and track this metric in Laminar.
- Built for scale with a modern stack: written in Rust, with RabbitMQ as the message queue, Postgres for data, and ClickHouse for analytics.
- Insightful, fast dashboards for traces / spans / events

Read the [docs](https://docs.lmnr.ai).
This repo is a work in progress and will be updated frequently.
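The decorator-based function tracing mentioned in the feature list above can be sketched in plain Python. This is a hypothetical illustration of the idea (recording a span of inputs, output, and duration per call), not the actual lmnr implementation:

```python
import functools
import time

# Collected spans; a real tracer would export these via OpenTelemetry.
SPANS = []

def observe_sketch(fn):
    """Hypothetical @observe()-style decorator: records one span per call."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = fn(*args, **kwargs)
        SPANS.append({
            "name": fn.__name__,
            "inputs": {"args": args, "kwargs": kwargs},
            "output": result,
            "duration_ms": (time.perf_counter() - start) * 1000,
        })
        return result
    return wrapper

@observe_sketch
def greet(name):
    return f"hello, {name}"

greet("laminar")
print(SPANS[0]["name"])  # greet
```

The real `@observe()` decorator additionally nests spans and ships them to the Laminar backend rather than a local list.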
## Getting started
### Laminar Cloud
The easiest way to get started is with a generous free tier on our managed platform -> [lmnr.ai](https://www.lmnr.ai).

### Self-hosting with Docker Compose
Start a local version with Docker Compose:
```sh
git clone [email protected]:lmnr-ai/lmnr
cd lmnr
docker compose up
```

This will spin up the following containers:
- app-server – the core app logic, backend, and the LLM proxies
- rabbitmq – message queue for sending the traces and observations reliably
- qdrant – vector database
- semantic-search-service – service for interacting with qdrant and embeddings
- frontend – the visual front-end dashboard for interacting with traces
- postgres – the database for all the application data
- clickhouse – columnar OLAP database for more efficient event and trace analytics

#### Local development
The simple setup above will pull the latest Laminar images from GitHub Container Registry.
If you want to test your local changes, you will need to build from source using [Local docker compose](./docker-compose-local-dev.yml):
```sh
docker compose -f docker-compose-local-dev.yml up --build
```

### Instrumenting Python code
First, create a project and generate a Project API Key. Then,
```sh
pip install lmnr
echo "LMNR_PROJECT_API_KEY=" >> .env
```

To automatically instrument LLM calls of popular frameworks and LLM provider libraries, just add:
```python
from lmnr import Laminar as L
L.initialize(project_api_key="")
```

In addition to automatic instrumentation, we provide a simple `@observe()` decorator if you want to trace inputs / outputs of functions.
#### Example

```python
import os
from openai import OpenAI

from lmnr import observe, Laminar as L

L.initialize(project_api_key="")

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

@observe()  # annotate all functions you want to trace
def poem_writer(topic="turbulence"):
    prompt = f"write a poem about {topic}"
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": prompt},
        ],
    )
    poem = response.choices[0].message.content
    return poem


if __name__ == "__main__":
    print(poem_writer(topic="laminar flow"))
```

#### Sending events
To send an event, call `L.event(name, value)`.
Read our [docs](https://docs.lmnr.ai) to learn more about events and how they are created.
```python
from lmnr import Laminar as L
# ...
poem = response.choices[0].message.content

# this will register a True or False value with Laminar
L.event("topic alignment", topic in poem)
```

#### Laminar pipelines as prompt chain managers
You can create Laminar pipelines in the UI and manage chains of LLM calls there.
Once you are ready to use your pipeline in your code, deploy it in Laminar by selecting the target version for the pipeline.
Once your pipeline target is set, you can call it from Python in just a few lines.
```python
from lmnr import Laminar as L

L.initialize('')

result = L.run(
pipeline = 'my_pipeline_name',
inputs = {'input_node_name': 'some_value'},
# all environment variables
env = {'OPENAI_API_KEY': 'sk-some-key'},
)
```

## Learn more
To learn more about instrumenting your code, check out our client libraries:
![NPM Version](https://img.shields.io/npm/v/%40lmnr-ai%2Flmnr?label=lmnr&logo=npm&logoColor=CB3837)
![PyPI - Version](https://img.shields.io/pypi/v/lmnr?label=lmnr&logo=pypi&logoColor=3775A9)

To get a deeper understanding of the concepts, follow on to the [docs](https://docs.lmnr.ai/) and [tutorials](https://docs.lmnr.ai/tutorials).
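As a closing illustration of the semantic-events idea from the feature list: each event records a name and a value, and a metric is just an aggregate over those values. A minimal sketch with hypothetical names (this is not the Laminar API):

```python
from collections import defaultdict

# Hypothetical in-memory event store: event name -> recorded values.
events = defaultdict(list)

def record_event(name, value):
    """Sketch of an L.event(name, value)-style call."""
    events[name].append(value)

# An "upsell" pipeline output, recorded as boolean events
# (did the AI drive-through agent make an upsell on each interaction?).
for made_upsell in [True, False, True, True]:
    record_event("upsell", made_upsell)

# The metric: fraction of interactions where the agent made an upsell.
upsell_rate = sum(events["upsell"]) / len(events["upsell"])
print(upsell_rate)  # 0.75
```

In Laminar, the event values come from background LLM pipelines rather than hand-written booleans, and the aggregation is handled by the analytics dashboards.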