Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/jrhizor/elelem
Simple, opinionated, JSON-typed, and traced LLM framework for TypeScript.
- Host: GitHub
- URL: https://github.com/jrhizor/elelem
- Owner: jrhizor
- License: MIT
- Created: 2023-09-12T01:32:42.000Z (about 1 year ago)
- Default Branch: main
- Last Pushed: 2024-03-11T22:01:53.000Z (8 months ago)
- Last Synced: 2024-10-06T09:44:30.084Z (about 1 month ago)
- Topics: ai, cohere, llm, openai
- Language: TypeScript
- Homepage: https://www.mealbymeal.com
- Size: 87.9 KB
- Stars: 35
- Watchers: 2
- Forks: 3
- Open Issues: 7
Metadata Files:
- Readme: README.md
- License: LICENSE
![Elelem](images/elelem.png)
# Elelem
Elelem is a simple, opinionated, JSON-typed, and traced LLM framework in TypeScript.

## Why another LLM library?
In September 2023, I tried to port [MealByMeal](https://mealbymeal.com/), a production LLM-based application, over to LangChain.
Caching wasn't supported for chat-based endpoints (specifically for `gpt-3.5-turbo`).
Additionally, the interface for interacting with these endpoints felt quite awkward.
Since then, LangChain Expression Language (LCEL) was introduced, but handling and enforcing typed outputs is still repetitive and error-prone.
Furthermore, without leveraging `gpt-4`, the structured outputs are seldom valid.
For debugging nuances like retries and parsing errors, the built-in tracing leaves much to be desired.
All of these issues led me to create my own lightweight library.

## How does Elelem compare to LangChain?
| Feature                                       | Elelem | LangChain |
|-----------------------------------------------|--------|-----------|
| TypeScript library | ✅ | ✅ |
| OpenAI generation support | ✅ | ✅ |
| Cohere generation support | ✅ | ✅ |
| Anthropic generation support | ✅ | ✅ |
| Emphasis on typed LLM outputs | ✅ | ❌ |
| Easily composable multi-step LLM workflows | ✅ | ❌ |
| Convenient API for single chat completions | ✅ | ❌ |
| Caching for OpenAI chat endpoints | ✅ | ❌ |
| OpenTelemetry support | ✅ | ❌ |
| Autogenerated JSON examples in prompts | ✅ | ❌ |
| Python library | ❌ | ✅ |
| Support for many models (Claude, Llama, etc.) | ❌ | ✅ |
| Vector store support | ❌ | ✅ |
| A million other features                      | ❌     | ✅        |

## Example
Install with `npm install elelem` or `yarn add elelem`.
You'll also need `zod` and `openai` (`yarn add zod openai`), plus `ioredis` if you're using Redis for caching (see `src/elelem.test.ts` for an example of setting up caching).
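Conceptually, the cache memoizes completions by a key derived from the prompt and request options, so repeated identical calls skip the API. Here is a minimal stand-in sketch of that idea in plain TypeScript — `fakeCompletion` and `cachedCompletion` are hypothetical illustrations using an in-memory `Map` rather than Redis, not elelem APIs:

```typescript
// Hypothetical sketch of prompt-keyed caching with an in-memory Map.
const cache = new Map<string, string>();

let apiCalls = 0;

// Stand-in for a real chat-completion request.
async function fakeCompletion(prompt: string): Promise<string> {
  apiCalls++;
  return `response to: ${prompt}`;
}

async function cachedCompletion(prompt: string): Promise<string> {
  // Key on everything that affects the output (prompt, model, options).
  const key = JSON.stringify({ prompt, model: "gpt-3.5-turbo" });
  const hit = cache.get(key);
  if (hit !== undefined) return hit; // cache hit: no API call
  const result = await fakeCompletion(prompt);
  cache.set(key, result);
  return result;
}

(async () => {
  await cachedCompletion("What is the capitol of the USA?");
  await cachedCompletion("What is the capitol of the USA?"); // served from cache; no second API call
})();
```

The real implementation swaps the `Map` for Redis so the cache survives restarts and is shared across processes.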
Usage:
```typescript
import { z } from "zod";
import OpenAI from "openai";
import { elelem, JsonSchemaAndExampleFormatter } from "elelem";

const capitolResponseSchema = z.object({
  capitol: z.string(),
});

const cityResponseSchema = z.object({
  foundingYear: z.string(),
  populationEstimate: z.number(),
});

const llm = elelem.init({
  openai: new OpenAI({ apiKey: process.env.OPENAI_API_KEY }),
});

const inputCountry = "USA";

(async () => {
  const { result, usage } = await llm.session(
    "capitol-info-retriever",
    { openai: { model: "gpt-3.5-turbo" } },
    async (c) => {
      const { result: capitol } = await c.openai(
        "get-capitol",
        { max_tokens: 100, temperature: 0 },
        `What is the capitol of the country provided?`,
        inputCountry,
        capitolResponseSchema,
        JsonSchemaAndExampleFormatter,
      );

      const { result: cityDescription } = await c.openai(
        "city-description",
        { max_tokens: 100, temperature: 0 },
        `For the given capitol city, return the founding year and an estimate of the population of the city.`,
        capitol.capitol,
        cityResponseSchema,
        JsonSchemaAndExampleFormatter,
      );

      return cityDescription;
    },
  );

  console.log(result);
  // { foundingYear: '1790', populationEstimate: 705749 }

  console.log(usage);
  // {
  //   completion_tokens: 26,
  //   prompt_tokens: 695,
  //   total_tokens: 721,
  //   cost_usd: 0.0010945
  // }
})();
```

## Viewing Traces on Jaeger
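For contrast, here is roughly what enforcing a typed JSON output looks like by hand — the repetitive, error-prone work the schema arguments above automate. This is an illustrative sketch in plain TypeScript (no elelem or zod; the `CityDescription` type and the raw strings are made up for the example):

```typescript
// Hand-rolled version of the schema step: parse the model's raw text,
// then verify the shape before trusting it.
interface CityDescription {
  foundingYear: string;
  populationEstimate: number;
}

function parseCityDescription(raw: string): CityDescription {
  let data: unknown;
  try {
    data = JSON.parse(raw);
  } catch {
    throw new Error("model did not return valid JSON");
  }
  const obj = data as Record<string, unknown>;
  if (
    typeof obj !== "object" || obj === null ||
    typeof obj["foundingYear"] !== "string" ||
    typeof obj["populationEstimate"] !== "number"
  ) {
    throw new Error("JSON does not match the expected schema");
  }
  return {
    foundingYear: obj["foundingYear"] as string,
    populationEstimate: obj["populationEstimate"] as number,
  };
}

// A well-formed response parses...
const ok = parseCityDescription('{"foundingYear":"1790","populationEstimate":705749}');
console.log(ok.populationEstimate); // 705749

// ...while a malformed one fails loudly instead of propagating bad data.
try {
  parseCityDescription('{"foundingYear":1790}');
} catch (e) {
  console.log((e as Error).message);
}
```

Writing a guard like this for every call site (plus retries when validation fails) is exactly the boilerplate the schema-and-formatter arguments eliminate.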
Start [Jaeger](https://www.jaegertracing.io/) locally using:
```
docker run --rm --name jaeger \
-e COLLECTOR_ZIPKIN_HOST_PORT=:9411 \
-p 6831:6831/udp \
-p 6832:6832/udp \
-p 5778:5778 \
-p 16686:16686 \
-p 4317:4317 \
-p 4318:4318 \
-p 14250:14250 \
-p 14268:14268 \
-p 14269:14269 \
-p 9411:9411 \
jaegertracing/all-in-one:1.49
```

Allow publishing traces to Jaeger with the following:
```typescript
import * as opentelemetry from "@opentelemetry/sdk-node";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-proto";

const sdk = new opentelemetry.NodeSDK({
  serviceName: "your-service-name",
  traceExporter: new OTLPTraceExporter(),
});
sdk.start();

// rest of your code...

process.on('SIGTERM', () => {
  sdk.shutdown()
    .then(() => console.log('Tracing terminated'))
    .catch((error) => console.log('Error terminating tracing', error))
    .finally(() => process.exit(0));
});
```

When you run your code, your traces will be available at http://localhost:16686/.
### What do the traces look like in Jaeger?
![Exploring Traces in Jaeger](https://i.imgur.com/1DGWd6O.gif)
### Tracing in Production
See the [OpenTelemetry docs](https://opentelemetry.io/docs/instrumentation/js/exporters/) for more information on sending traces to hosted instances of Zipkin, Jaeger, Datadog, etc.
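Most hosted backends accept the standard OTLP environment variables, so the exporter setup above usually needs no code changes. The endpoint and header values below are placeholders — substitute your vendor's actual values:

```shell
# Standard OpenTelemetry exporter settings, picked up automatically
# by OTLPTraceExporter. Values shown are placeholders.
export OTEL_EXPORTER_OTLP_ENDPOINT="https://otlp.example.com:4318"
export OTEL_EXPORTER_OTLP_HEADERS="api-key=YOUR_API_KEY"
export OTEL_SERVICE_NAME="your-service-name"
```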
## Contributing
Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.

Please make sure to update tests as appropriate.
## For Contributors: Running Integration Tests
To run the tests, first make sure you have Git, Yarn, and Docker installed. Then check out the repo and install dependencies:
```
git clone [email protected]:jrhizor/elelem.git
cd elelem
yarn install
```

Create a `.env` file:
```
OPENAI_API_KEY=
REDIS=redis://localhost:6379
```

Start up Redis:
```
docker run -it -p 6379:6379 redis
```

Start up Jaeger:
```
docker run --rm --name jaeger \
-e COLLECTOR_ZIPKIN_HOST_PORT=:9411 \
-p 6831:6831/udp \
-p 6832:6832/udp \
-p 5778:5778 \
-p 16686:16686 \
-p 4317:4317 \
-p 4318:4318 \
-p 14250:14250 \
-p 14268:14268 \
-p 14269:14269 \
-p 9411:9411 \
jaegertracing/all-in-one:1.49
```

Now you're ready to run the unit and integration tests:
```
yarn test
```

## License
[MIT](https://choosealicense.com/licenses/mit/)