https://github.com/inferra/inferra-js-sdk
Official JavaScript/TypeScript SDK for Inferra API access
- Host: GitHub
- URL: https://github.com/inferra/inferra-js-sdk
- Owner: Inferra
- License: MIT
- Created: 2025-02-10T23:35:15.000Z (4 months ago)
- Default Branch: main
- Last Pushed: 2025-02-20T15:02:16.000Z (4 months ago)
- Last Synced: 2025-04-05T14:12:04.825Z (2 months ago)
- Topics: ai, api-client, async, batch-processing, inferra, javascript, language-models, llama, llm, machine-learning, mistral, nodejs, openai-compatible, sdk, typescript
- Language: TypeScript
- Homepage:
- Size: 72.3 KB
- Stars: 1
- Watchers: 0
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# Inferra JavaScript SDK
The official JavaScript SDK for Inferra.net: access leading open-source AI models with just a few lines of code.
## Installation
```bash
npm install @inferra/sdk
# or
yarn add @inferra/sdk
```
## Quick Start
```typescript
import { InferraClient } from '@inferra/sdk';

const client = new InferraClient({
  apiKey: 'your-api-key'
});

// Create a chat completion
const response = await client.chat.create({
  model: 'meta-llama/llama-3.1-8b-instruct/fp-8',
  messages: [
    { role: 'system', content: 'You are a helpful assistant.' },
    { role: 'user', content: 'What is the meaning of life?' }
  ],
  stream: true
});

// Process streaming response
for await (const chunk of response) {
  if (chunk.choices[0]?.delta?.content) {
    process.stdout.write(chunk.choices[0].delta.content);
  }
}
```
## Features
- Full support for Inferra's API (see the non-streaming sketch after this list)
- Written in TypeScript with complete type definitions
- Built-in rate limiting and retries
- Streaming support
- Batch processing
- Comprehensive documentation
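Beyond the streaming quick start above, a plain completion can be requested by simply omitting `stream`. The sketch below is a minimal illustration, not taken from the official docs: the response shape (`choices[0].message.content`) is assumed from the SDK's OpenAI-compatible design, so check the bundled type definitions.
```typescript
import { InferraClient } from '@inferra/sdk';

const client = new InferraClient({
  apiKey: 'your-api-key'
});

// Non-streaming request: omit `stream` (or set it to false) and await the full result.
const completion = await client.chat.create({
  model: 'meta-llama/llama-3.2-1b-instruct/fp-8',
  messages: [
    { role: 'user', content: 'Summarize the plot of Hamlet in one sentence.' }
  ]
});

// Assumes an OpenAI-compatible response shape; verify against the SDK's types.
console.log(completion.choices[0]?.message?.content);
```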
## Available Models
| Model Name | Price (per 1M tokens) |
|------------|----------------------|
| meta-llama/llama-3.2-1b-instruct/fp-8 | $0.015 |
| meta-llama/llama-3.2-3b-instruct/fp-8 | $0.03 |
| meta-llama/llama-3.1-8b-instruct/fp-8 | $0.045 |
| meta-llama/llama-3.1-8b-instruct/fp-16 | $0.05 |
| mistralai/mistral-nemo-12b-instruct/fp-8 | $0.10 |
| meta-llama/llama-3.1-70b-instruct/fp-8 | $0.30 |
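Prices are per one million tokens, so a rough per-request cost is `tokens / 1,000,000 × price`. The helper below is a hypothetical illustration (not part of `@inferra/sdk`) using the rates from the table:
```typescript
// Hypothetical helper, not part of @inferra/sdk: estimates request cost from the pricing table above.
const PRICE_PER_MILLION_TOKENS: Record<string, number> = {
  'meta-llama/llama-3.2-1b-instruct/fp-8': 0.015,
  'meta-llama/llama-3.1-8b-instruct/fp-8': 0.045,
  'meta-llama/llama-3.1-70b-instruct/fp-8': 0.3,
};

function estimateCostUsd(model: string, totalTokens: number): number {
  const rate = PRICE_PER_MILLION_TOKENS[model];
  if (rate === undefined) throw new Error(`Unknown model: ${model}`);
  return (totalTokens / 1_000_000) * rate;
}

// Example: 120,000 tokens through llama-3.1-8b at $0.045 per 1M tokens ≈ $0.0054.
console.log(estimateCostUsd('meta-llama/llama-3.1-8b-instruct/fp-8', 120_000));
```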
## Development
```bash
# Install dependencies
npm install

# Run tests
npm test

# Run linting
npm run lint

# Build
npm run build
```