OpenTelemetry Instrumentation for AI Observability
- Host: GitHub
- URL: https://github.com/arize-ai/openinference
- Owner: Arize-ai
- License: apache-2.0
- Created: 2023-12-26T17:33:58.000Z (over 1 year ago)
- Default Branch: main
- Last Pushed: 2025-04-08T16:31:21.000Z (16 days ago)
- Last Synced: 2025-04-09T08:42:25.286Z (15 days ago)
- Topics: aiops, hacktoberfest, haystack, langchain, langraph, llamaindex, llmops, llms, openai, openai-agents, opentelemetry, smolagents, telemetry, tracing, vercel, vertex
- Language: Python
- Homepage: https://arize-ai.github.io/openinference/
- Size: 5.65 MB
- Stars: 367
- Watchers: 10
- Forks: 76
- Open Issues: 99
- Metadata Files:
  - Readme: README.md
  - Contributing: CONTRIBUTING
  - License: LICENSE
  - Code of conduct: CODE_OF_CONDUCT.md
  - Codeowners: .github/CODEOWNERS
  - Security: SECURITY
## README
OpenInference is a set of conventions and plugins that complements [OpenTelemetry](https://opentelemetry.io/) to enable tracing of AI applications. OpenInference is natively supported by [arize-phoenix](https://github.com/Arize-ai/phoenix), but can also be used with any OpenTelemetry-compatible backend.
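In practice, an OpenInference instrumentor is attached to an ordinary OpenTelemetry tracer provider. The sketch below is illustrative rather than prescriptive: it assumes the `openinference-instrumentation-openai` package listed further down, the OpenTelemetry SDK with the OTLP/HTTP exporter, and a locally running Phoenix instance as the collector; any OTLP-compatible endpoint can be swapped in.

```python
# Illustrative setup (assumes openinference-instrumentation-openai and the
# OpenTelemetry SDK + OTLP HTTP exporter are installed). The endpoint below is
# the default for a local Phoenix instance; any OTLP-compatible collector works.
from openinference.instrumentation.openai import OpenAIInstrumentor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import SimpleSpanProcessor

tracer_provider = TracerProvider()
tracer_provider.add_span_processor(
    SimpleSpanProcessor(OTLPSpanExporter(endpoint="http://localhost:6006/v1/traces"))
)

# Once instrumented, OpenAI SDK calls made by the application are exported as
# OpenInference spans to the configured collector.
OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)
```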
## Specification
The OpenInference specification is edited in markdown files found in the [spec directory](./spec/). It is designed to provide insight into the invocation of LLMs and the surrounding application context, such as retrieval from vector stores and the use of external tools like search engines or APIs. The specification is transport- and file-format-agnostic and is intended to be used in conjunction with representations such as JSON, Protobuf, and DataFrames.
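To give a sense of what the conventions describe, the sketch below assembles a few span attributes for a single LLM invocation using constants from the `openinference-semantic-conventions` package. The attribute names shown are a small, illustrative subset and the values are made up; the spec directory remains the authoritative reference.

```python
# Illustrative subset of OpenInference span attributes for one LLM call,
# built from constants in openinference-semantic-conventions.
# Values are invented for the example; see ./spec for the full definitions.
from openinference.semconv.trace import OpenInferenceSpanKindValues, SpanAttributes

llm_span_attributes = {
    SpanAttributes.OPENINFERENCE_SPAN_KIND: OpenInferenceSpanKindValues.LLM.value,
    SpanAttributes.LLM_MODEL_NAME: "gpt-4o-mini",
    SpanAttributes.INPUT_VALUE: "What is OpenInference?",
    SpanAttributes.OUTPUT_VALUE: "A set of tracing conventions for AI applications.",
    SpanAttributes.LLM_TOKEN_COUNT_PROMPT: 12,
    SpanAttributes.LLM_TOKEN_COUNT_COMPLETION: 9,
}
```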
## Instrumentation
OpenInference provides a set of instrumentations for popular machine learning SDKs and frameworks in a variety of languages.
## Python
### Libraries
| Package | Description | Version |
| --------------------------------------------------------------------------------------------------------------------- | ---------------------------------------------------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| [`openinference-semantic-conventions`](./python/openinference-semantic-conventions) | Semantic conventions for tracing of LLM Apps. | [](https://pypi.python.org/pypi/openinference-semantic-conventions) |
| [`openinference-instrumentation-openai`](./python/instrumentation/openinference-instrumentation-openai) | OpenInference Instrumentation for OpenAI SDK. | [](https://pypi.python.org/pypi/openinference-instrumentation-openai) |
| [`openinference-instrumentation-openai-agents`](./python/instrumentation/openinference-instrumentation-openai-agents) | OpenInference Instrumentation for OpenAI Agents SDK. | [](https://pypi.python.org/pypi/openinference-instrumentation-openai-agents) |
| [`openinference-instrumentation-llama-index`](./python/instrumentation/openinference-instrumentation-llama-index) | OpenInference Instrumentation for LlamaIndex. | [](https://pypi.python.org/pypi/openinference-instrumentation-llama-index) |
| [`openinference-instrumentation-dspy`](./python/instrumentation/openinference-instrumentation-dspy) | OpenInference Instrumentation for DSPy. | [](https://pypi.python.org/pypi/openinference-instrumentation-dspy) |
| [`openinference-instrumentation-bedrock`](./python/instrumentation/openinference-instrumentation-bedrock) | OpenInference Instrumentation for AWS Bedrock. | [](https://pypi.python.org/pypi/openinference-instrumentation-bedrock) |
| [`openinference-instrumentation-langchain`](./python/instrumentation/openinference-instrumentation-langchain) | OpenInference Instrumentation for LangChain. | [](https://pypi.python.org/pypi/openinference-instrumentation-langchain) |
| [`openinference-instrumentation-mistralai`](./python/instrumentation/openinference-instrumentation-mistralai) | OpenInference Instrumentation for MistralAI. | [](https://pypi.python.org/pypi/openinference-instrumentation-mistralai) |
| [`openinference-instrumentation-guardrails`](./python/instrumentation/openinference-instrumentation-guardrails) | OpenInference Instrumentation for Guardrails. | [](https://pypi.python.org/pypi/openinference-instrumentation-guardrails) |
| [`openinference-instrumentation-vertexai`](./python/instrumentation/openinference-instrumentation-vertexai) | OpenInference Instrumentation for VertexAI. | [](https://pypi.python.org/pypi/openinference-instrumentation-vertexai) |
| [`openinference-instrumentation-crewai`](./python/instrumentation/openinference-instrumentation-crewai) | OpenInference Instrumentation for CrewAI. | [](https://pypi.python.org/pypi/openinference-instrumentation-crewai) |
| [`openinference-instrumentation-haystack`](./python/instrumentation/openinference-instrumentation-haystack) | OpenInference Instrumentation for Haystack. | [](https://pypi.python.org/pypi/openinference-instrumentation-haystack) |
| [`openinference-instrumentation-litellm`](./python/instrumentation/openinference-instrumentation-litellm) | OpenInference Instrumentation for liteLLM. | [](https://pypi.python.org/pypi/openinference-instrumentation-litellm) |
| [`openinference-instrumentation-groq`](./python/instrumentation/openinference-instrumentation-groq) | OpenInference Instrumentation for Groq. | [](https://pypi.python.org/pypi/openinference-instrumentation-groq) |
| [`openinference-instrumentation-instructor`](./python/instrumentation/openinference-instrumentation-instructor) | OpenInference Instrumentation for Instructor. | [](https://pypi.python.org/pypi/openinference-instrumentation-instructor) |
| [`openinference-instrumentation-anthropic`](./python/instrumentation/openinference-instrumentation-anthropic) | OpenInference Instrumentation for Anthropic. | [](https://pypi.python.org/pypi/openinference-instrumentation-anthropic) |

### Examples
| Name | Description | Complexity Level |
| ----------------------------------------------------------------------------------------------------- | -------------------------------------------------------------------------------------------- | ---------------- |
| [OpenAI SDK](python/instrumentation/openinference-instrumentation-openai/examples/) | OpenAI Python SDK, including chat completions and embeddings | Beginner |
| [MistralAI SDK](python/instrumentation/openinference-instrumentation-mistralai/examples/) | MistralAI Python SDK | Beginner |
| [VertexAI SDK](python/instrumentation/openinference-instrumentation-vertexai/examples/) | VertexAI Python SDK | Beginner |
| [LlamaIndex](python/instrumentation/openinference-instrumentation-llama-index/examples/) | LlamaIndex query engines | Beginner |
| [DSPy](python/instrumentation/openinference-instrumentation-dspy/examples/) | DSPy primitives and custom RAG modules | Beginner |
| [Boto3 Bedrock Client](python/instrumentation/openinference-instrumentation-bedrock/examples/) | Boto3 Bedrock client | Beginner |
| [LangChain](python/instrumentation/openinference-instrumentation-langchain/examples/) | LangChain primitives and simple chains | Beginner |
| [LiteLLM](python/instrumentation/openinference-instrumentation-litellm/) | A lightweight LiteLLM framework | Beginner |
| [LiteLLM Proxy](python/instrumentation/openinference-instrumentation-litellm/examples/litellm-proxy/) | LiteLLM Proxy to log OpenAI, Azure, Vertex, Bedrock | Beginner |
| [Groq](python/instrumentation/openinference-instrumentation-groq/examples/) | Groq and AsyncGroq chat completions | Beginner |
| [Anthropic](python/instrumentation/openinference-instrumentation-anthropic/examples/) | Anthropic Messages client | Beginner |
| [LlamaIndex + Next.js Chatbot](python/examples/llama-index/) | A fully functional chatbot using Next.js and a LlamaIndex FastAPI backend | Intermediate |
| [LangServe](python/examples/langserve/) | A LangChain application deployed with LangServe using custom metadata on a per-request basis | Intermediate |
| [DSPy](python/examples/dspy-rag-fastapi/) | A DSPy RAG application using FastAPI, Weaviate, and Cohere | Intermediate |
| [Haystack](python/instrumentation/openinference-instrumentation-haystack/examples/) | A Haystack QA RAG application | Intermediate |
| [OpenAI Agents](python/instrumentation/openinference-instrumentation-openai-agents/examples/) | OpenAI Agents with handoffs | Intermediate |

## JavaScript
### Libraries
| Package | Description | Version |
| ----------------------------------------------------------------------------------------------------------- | ----------------------------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| [`@arizeai/openinference-semantic-conventions`](./js/packages/openinference-semantic-conventions) | Semantic conventions for tracing of LLM Apps. | [](https://www.npmjs.com/package/@arizeai/openinference-semantic-conventions) |
| [`@arizeai/openinference-core`](./js/packages/openinference-core) | Core utility functions for instrumentation | [](https://www.npmjs.com/package/@arizeai/openinference-core) |
| [`@arizeai/openinference-instrumentation-openai`](./js/packages/openinference-instrumentation-openai) | OpenInference Instrumentation for OpenAI SDK. | [](https://www.npmjs.com/package/@arizeai/openinference-instrumentation-openai) |
| [`@arizeai/openinference-instrumentation-langchain`](./js/packages/openinference-instrumentation-langchain) | OpenInference Instrumentation for LangChain.js. | [](https://www.npmjs.com/package/@arizeai/openinference-instrumentation-langchain) |
| [`@arizeai/openinference-instrumentation-beeai`](./js/packages/openinference-instrumentation-beeai) | OpenInference Instrumentation for BeeAI. | [](https://www.npmjs.com/package/@arizeai/openinference-instrumentation-beeai) |
| [`@arizeai/openinference-vercel`](./js/packages/openinference-vercel) | OpenInference Support for Vercel AI SDK | [](https://www.npmjs.com/package/@arizeai/openinference-vercel) |

### Examples
| Name | Description | Complexity Level |
| -------------------------------------------------------------------------------- | ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | ---------------- |
| [OpenAI SDK](js/examples/openai) | OpenAI Node.js client | Beginner |
| [BeeAI framework - ReAct agent](js/packages/openinference-instrumentation-beeai/examples/run-react-agent.ts) | Agentic `ReActAgent` instrumentation in the BeeAI framework | Beginner |
| [BeeAI framework - ToolCalling agent](js/packages/openinference-instrumentation-beeai/examples/run-toolcalling-agent.ts) | Agentic `ToolCallingAgent` instrumentation in the BeeAI framework | Beginner |
| [BeeAI framework - LLM](js/packages/openinference-instrumentation-beeai/examples/run-llm.ts) | Instrumentation scoped to just the LLM module of the BeeAI framework | Beginner |
| [LlamaIndex Express App](js/examples/llama-index-express) | A fully functional LlamaIndex chatbot with a Next.js frontend and a LlamaIndex Express backend, instrumented using `openinference-instrumentation-openai` | Intermediate |
| [LangChain OpenAI](js/packages/openinference-instrumentation-langchain/examples) | A simple script to call OpenAI via LangChain, instrumented using `openinference-instrumentation-langchain` | Beginner |
| [LangChain RAG Express App](js/examples/langchain-express) | A fully functional LangChain chatbot that uses RAG to answer user questions. It has a Next.js frontend and a LangChain Express backend, instrumented using `openinference-instrumentation-langchain` | Intermediate |
| [Next.js + OpenAI](js/examples/nextjs-openai-simple/) | A Next.js 13 project bootstrapped with `create-next-app` that uses OpenAI to generate text | Beginner |

## Supported Destinations
OpenInference supports the following destinations as span collectors.
- ✅ [Arize-Phoenix](https://github.com/Arize-ai/phoenix)
- ✅ [Arize](https://arize.com/)
- ✅ Any OTEL-compatible collector
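Because spans travel over plain OTLP, switching destinations generally comes down to pointing the exporter at a different endpoint. A brief sketch follows; the endpoints are illustrative (Phoenix's default local address and the standard OTLP/HTTP port of a generic collector).

```python
# Sketch: only the exporter endpoint changes when switching span collectors.
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

phoenix_exporter = OTLPSpanExporter(endpoint="http://localhost:6006/v1/traces")  # local Phoenix
generic_exporter = OTLPSpanExporter(endpoint="http://localhost:4318/v1/traces")  # any OTLP/HTTP collector
```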
## Community

Join our community to connect with thousands of machine learning practitioners and LLM observability enthusiasts!
- 🌍 Join our [Slack community](https://arize-ai.slack.com/join/shared_invite/zt-11t1vbu4x-xkBIHmOREQnYnYDH1GDfCg?__hstc=259489365.a667dfafcfa0169c8aee4178d115dc81.1733501603539.1733501603539.1733501603539.1&__hssc=259489365.1.1733501603539&__hsfp=3822854628&submissionGuid=381a0676-8f38-437b-96f2-fc10875658df#/shared-invite/email).
- 💡 Ask questions and provide feedback in the _#phoenix-support_ channel.
- 🌟 Leave a star on our [GitHub](https://github.com/Arize-ai/openinference).
- 🐞 Report bugs with [GitHub Issues](https://github.com/Arize-ai/openinference/issues).
- 𝕏 Follow us on [X](https://twitter.com/ArizePhoenix).
- 🗺️ Check out our [roadmap](https://github.com/orgs/Arize-ai/projects/45) to see where we're heading next.