Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/andygeek/multillama-ts
MultiLlama 🦙🦙🦙 is a TypeScript framework for using multiple LLMs simultaneously, enabling dynamic decision flows.
- Host: GitHub
- URL: https://github.com/andygeek/multillama-ts
- Owner: andygeek
- License: mit
- Created: 2024-10-12T23:10:48.000Z (3 months ago)
- Default Branch: master
- Last Pushed: 2024-10-20T05:16:32.000Z (3 months ago)
- Last Synced: 2024-10-20T10:02:22.029Z (3 months ago)
- Topics: ai, llm, llms
- Language: TypeScript
- Homepage:
- Size: 1.14 MB
- Stars: 2
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
# MultiLlama
MultiLlama 🦙🦙🦙 is a TypeScript framework that lets developers use multiple Large Language Models (LLMs) simultaneously. Designed to unify different AI models behind one interface, MultiLlama enables the creation of dynamic decision flows and manages complex processes, leveraging the strengths of each model.
---
## Supported Services
MultiLlama currently supports the following services:
- ✅ **OpenAI**
- ✅ **Ollama**
- ✅ **Anthropic**
- ⏳ **Gemini** *(coming soon)*

---
## Table of Contents
- [Features](#features)
- [Installation](#installation)
- [Getting Started](#getting-started)
- [Usage Examples](#usage-examples)
  - [Basic Usage](#basic-usage)
  - [Creating a Pipeline](#creating-a-pipeline)

---
## Features
- **Unified Interface**: Interact with multiple language models through a single, consistent API.
- **Pipeline Processing**: Build complex processing pipelines with conditional branching and context management.
- **Extensibility**: Easily add support for new models and services via adapters.
- **Configurable**: Initialize and manage configurations from code or external files.
- **Spinner Integration**: Built-in support for CLI spinners to enhance the user experience during processing.

---
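To illustrate the adapter-based extensibility described above, here is a minimal sketch of what a pluggable adapter could look like. The `ModelAdapter` interface, `EchoAdapter` class, and `run` signature are illustrative assumptions, not the published multillama API; consult the source for the actual adapter contract.

```typescript
// Hypothetical sketch of the adapter pattern; names and signatures are
// assumptions for illustration, not the actual multillama API.
interface ChatMessage {
  role: 'user' | 'assistant' | 'system';
  content: string;
}

interface ModelAdapter {
  // Send a chat-style request to the underlying service and return its reply.
  run(model: string, messages: ChatMessage[]): Promise<string>;
}

// A trivial adapter that echoes the last message, useful for local testing
// without calling any real LLM service.
class EchoAdapter implements ModelAdapter {
  async run(model: string, messages: ChatMessage[]): Promise<string> {
    const last = messages[messages.length - 1];
    return `[${model}] ${last.content}`;
  }
}

// Any service implementing ModelAdapter can be swapped in transparently.
async function demo(): Promise<string> {
  const adapter: ModelAdapter = new EchoAdapter();
  return adapter.run('echo-1', [{ role: 'user', content: 'hello' }]);
}
```

The point of the pattern is that the framework only depends on the interface, so adding a new provider means writing one small class rather than touching the pipeline logic.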
## Installation
To install MultiLlama, use npm:
```bash
npm install multillama
```

---
## Getting Started
First, import the necessary classes and initialize the `MultiLlama` instance with your configuration.
```typescript
import { MultiLlama, OpenAIAdapter, OllamaAdapter } from 'multillama';

// Define service configurations
const openaiService = {
  adapter: new OpenAIAdapter(),
  apiKey: 'your-openai-api-key',
};

const ollamaService = {
  adapter: new OllamaAdapter(),
};

// Define model configurations
const models = {
  gpt4: {
    service: openaiService,
    name: 'gpt-4',
    response_format: 'json',
  },
  llama: {
    service: ollamaService,
    name: 'llama-2',
    response_format: 'text',
  },
};

// Initialize MultiLlama
MultiLlama.initialize({
  services: {
    openai: openaiService,
    ollama: ollamaService,
  },
  models,
  spinnerConfig: {
    loadingMessage: 'Processing...',
    successMessage: 'Done!',
    errorMessage: 'An error occurred.',
  },
});
```

---
## Usage Examples
### Basic Usage
Use a specific model to generate a response to a prompt.
```typescript
const multillama = new MultiLlama();

async function generateResponse() {
  const prompt = 'What is the capital of France?';
  const response = await multillama.useModel('gpt4', [
    { role: 'user', content: prompt },
  ]);
  console.log(response);
}

generateResponse();
```

**Output:**

```
Paris
```

### Creating a Pipeline
Create a processing pipeline with conditional steps and branching.
```typescript
import { MultiLlama, Pipeline } from 'multillama';

async function processInput(userInput: string) {
  const multillama = new MultiLlama();
  const pipeline = new Pipeline();
  pipeline.setEnableLogging(true);

  // Initial step: analyze the input
  const initialStep = pipeline.addStep(async (input, context) => {
    // Determine the type of question
    const analysisPrompt = `Analyze the following question and categorize it: "${input}"`;
    const response = await multillama.useModel('gpt4', [
      { role: 'user', content: analysisPrompt },
    ]);
    if (response.includes('weather')) {
      return 'weather_question';
    } else {
      return 'general_question';
    }
  });

  // Branch for weather-related questions
  const weatherStep = pipeline.addStep(async (input, context) => {
    const weatherPrompt = `Provide a weather report for "${context.initialInput}"`;
    return await multillama.useModel('gpt4', [
      { role: 'user', content: weatherPrompt },
    ]);
  });

  // Branch for general questions
  const generalStep = pipeline.addStep(async (input, context) => {
    return await multillama.useModel('llama', [
      { role: 'user', content: context.initialInput },
    ]);
  });

  // Set up branching
  pipeline.addBranch(initialStep, 'weather_question', weatherStep);
  pipeline.addBranch(initialStep, 'general_question', generalStep);

  // Execute the pipeline
  const result = await pipeline.execute(userInput);
  console.log(result);
}

processInput('What is the weather like in New York?');
```

**Output:**

```
The current weather in New York is sunny with a temperature of 25°C.
```

---
*Happy Coding!* 🦙🚀