https://github.com/zya/litellmjs
JavaScript implementation of LiteLLM.
- Host: GitHub
- URL: https://github.com/zya/litellmjs
- Owner: zya
- License: mit
- Created: 2023-10-03T22:42:27.000Z (over 2 years ago)
- Default Branch: main
- Last Pushed: 2025-03-21T15:04:48.000Z (12 months ago)
- Last Synced: 2025-05-11T01:01:38.471Z (10 months ago)
- Topics: javascript, llama2, llm, nodejs, ollama, openai
- Language: TypeScript
- Homepage:
- Size: 141 KB
- Stars: 124
- Watchers: 2
- Forks: 20
- Open Issues: 2
Metadata Files:
- Readme: README.md
- License: LICENSE
🚅 LiteLLM.js
JavaScript implementation of LiteLLM.
# Usage
```
npm install litellm
```
```ts
import { completion } from 'litellm';

process.env['OPENAI_API_KEY'] = 'your-openai-key';

const response = await completion({
  model: 'gpt-3.5-turbo',
  messages: [{ content: 'Hello, how are you?', role: 'user' }],
});

// or stream the results
const stream = await completion({
  model: 'gpt-3.5-turbo',
  messages: [{ content: 'Hello, how are you?', role: 'user' }],
  stream: true,
});

for await (const part of stream) {
  process.stdout.write(part.choices[0]?.delta?.content || '');
}
```
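The streaming loop above consumes an async iterable of OpenAI-style chunks, each carrying a `choices[0].delta.content` fragment. A stub generator (purely illustrative, no network and no real litellm types) shows how the `for await` loop assembles the final text:

```typescript
// Minimal stand-in for the chunk shape consumed above (assumed shape).
type Chunk = { choices: { delta?: { content?: string } }[] };

// Stub stream: yields the reply in pieces, like a real provider would.
async function* fakeStream(): AsyncGenerator<Chunk> {
  for (const piece of ['Hello', ', ', 'world']) {
    yield { choices: [{ delta: { content: piece } }] };
  }
}

let out = '';
for await (const part of fakeStream()) {
  // Same accumulation pattern as the README's streaming example.
  out += part.choices[0]?.delta?.content || '';
}
console.log(out); // Hello, world
```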
# Features
We aim to support all features that the [LiteLLM Python package](https://github.com/BerriAI/litellm) supports.
* Standardised completions ✅
* Standardised embeddings ✅
* Standardised input params 🚧 - List is [here](/docs/input-params.md)
* Caching ❌
* Proxy ❌
## Supported Providers
| Provider | Completion | Streaming | Embedding |
| ------------- | ------------- | ------------- | ------------- |
| [openai](https://docs.litellm.ai/docs/providers/openai) | ✅ | ✅ | ✅ |
| [cohere](https://docs.litellm.ai/docs/providers/cohere) | ✅ | ✅ | ❌ |
| [anthropic](https://docs.litellm.ai/docs/providers/anthropic) | ✅ | ✅ | ❌ |
| [ollama](https://docs.litellm.ai/docs/providers/ollama) | ✅ | ✅ | ✅ |
| [ai21](https://docs.litellm.ai/docs/providers/ai21) | ✅ | ✅ | ❌ |
| [replicate](https://docs.litellm.ai/docs/providers/replicate) | ✅ | ✅ | ❌ |
| [deepinfra](https://docs.litellm.ai/docs/providers/deepinfra) | ✅ | ✅ | ❌ |
| [mistral](https://docs.litellm.ai/docs/providers/mistral) | ✅ | ✅ | ✅ |
| [huggingface](https://docs.litellm.ai/docs/providers/huggingface) | ❌ | ❌ | ❌ |
| [together_ai](https://docs.litellm.ai/docs/providers/togetherai) | ❌ | ❌ | ❌ |
| [openrouter](https://docs.litellm.ai/docs/providers/openrouter) | ❌ | ❌ | ❌ |
| [vertex_ai](https://docs.litellm.ai/docs/providers/vertex) | ❌ | ❌ | ❌ |
| [palm](https://docs.litellm.ai/docs/providers/palm) | ❌ | ❌ | ❌ |
| [baseten](https://docs.litellm.ai/docs/providers/baseten) | ❌ | ❌ | ❌ |
| [azure](https://docs.litellm.ai/docs/providers/azure) | ❌ | ❌ | ❌ |
| [sagemaker](https://docs.litellm.ai/docs/providers/aws_sagemaker) | ❌ | ❌ | ❌ |
| [bedrock](https://docs.litellm.ai/docs/providers/bedrock) | ❌ | ❌ | ❌ |
| [vllm](https://docs.litellm.ai/docs/providers/vllm) | ❌ | ❌ | ❌ |
| [nlp_cloud](https://docs.litellm.ai/docs/providers/nlp_cloud) | ❌ | ❌ | ❌ |
| [aleph alpha](https://docs.litellm.ai/docs/providers/aleph_alpha) | ❌ | ❌ | ❌ |
| [petals](https://docs.litellm.ai/docs/providers/petals) | ❌ | ❌ | ❌ |
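Following the Python LiteLLM convention, non-default providers are presumably addressed by prefixing the model string (e.g. `ollama/llama2`), so switching providers is just a change to `model`. The real routing lives inside the library; the sketch below is a hypothetical illustration of prefix-based dispatch (`resolveProvider` is not an actual export):

```typescript
// Hypothetical sketch of provider dispatch keyed off a "provider/" prefix.
// The actual litellm internals may differ; this only illustrates the convention.
function resolveProvider(model: string): { provider: string; model: string } {
  const idx = model.indexOf('/');
  if (idx === -1) {
    // No prefix: fall back to a default provider (assumed to be openai).
    return { provider: 'openai', model };
  }
  return { provider: model.slice(0, idx), model: model.slice(idx + 1) };
}

console.log(resolveProvider('gpt-3.5-turbo')); // { provider: 'openai', model: 'gpt-3.5-turbo' }
console.log(resolveProvider('ollama/llama2')); // { provider: 'ollama', model: 'llama2' }
```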
# Development
## Clone the repo
```
git clone https://github.com/zya/litellmjs.git
```
## Install dependencies
```
npm install
```
## Run unit tests
```
npm t
```
## Run E2E tests
First copy the example env file.
```
cp .example.env .env
```
Then fill in the variables with your API keys so the E2E tests can run.
```
OPENAI_API_KEY=
....
```
Then run the command below to run the tests.
```
npm run test:e2e
```
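The E2E suite reads these keys from `process.env`, so a run with an unset key fails mid-test. A small hypothetical pre-flight helper (`missingKeys` is not part of the repo) can flag unset keys before starting:

```typescript
// Hypothetical pre-flight check: return any required env keys that are unset or empty.
function missingKeys(
  required: string[],
  env: Record<string, string | undefined> = process.env,
): string[] {
  return required.filter((key) => !env[key]);
}

console.log(missingKeys(['OPENAI_API_KEY'], { OPENAI_API_KEY: 'sk-test' })); // []
console.log(missingKeys(['OPENAI_API_KEY'], {})); // [ 'OPENAI_API_KEY' ]
```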