# OmniAI::Mistral

[![LICENSE](https://img.shields.io/badge/license-MIT-blue.svg)](https://github.com/ksylvest/omniai-mistral/blob/main/LICENSE)
[![RubyGems](https://img.shields.io/gem/v/omniai-mistral)](https://rubygems.org/gems/omniai-mistral)
[![GitHub](https://img.shields.io/badge/github-repo-blue.svg)](https://github.com/ksylvest/omniai-mistral)
[![Yard](https://img.shields.io/badge/docs-site-blue.svg)](https://omniai-mistral.ksylvest.com)
[![CircleCI](https://img.shields.io/circleci/build/github/ksylvest/omniai-mistral)](https://circleci.com/gh/ksylvest/omniai-mistral)

A Mistral implementation of the [OmniAI](https://github.com/ksylvest/omniai) APIs.

## Installation

```sh
gem install omniai-mistral
```
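
Alternatively, the gem may be added to a project's Gemfile (a minimal sketch for Bundler users):

```ruby
gem 'omniai-mistral'
```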

## Usage

### Client

A client is set up as follows if `ENV['MISTRAL_API_KEY']` exists:

```ruby
client = OmniAI::Mistral::Client.new
```

A client may also be passed the following options (see the example below):

- `api_key` (required - default is `ENV['MISTRAL_API_KEY']`)
- `host` (optional)
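
For example, both options may be passed explicitly (the key and host values below are placeholders):

```ruby
client = OmniAI::Mistral::Client.new(
  api_key: 'sk-...',             # placeholder; defaults to ENV['MISTRAL_API_KEY']
  host: 'https://api.mistral.ai' # optional override of the default host
)
```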

### Configuration

Global configuration is supported for the following options:

```ruby
OmniAI::Mistral.configure do |config|
  config.api_key = 'sk-...' # default: ENV['MISTRAL_API_KEY']
  config.host = '...' # default: 'https://api.mistral.ai'
end
```

### Chat

A chat completion is generated by passing in prompts using any of a variety of formats:

```ruby
completion = client.chat('Tell me a joke!')
completion.text # 'Why did the chicken cross the road? To get to the other side.'
```

```ruby
completion = client.chat do |prompt|
  prompt.system('You are a helpful assistant.')
  prompt.user('What is the capital of Canada?')
end
completion.text # 'The capital of Canada is Ottawa.'
```

#### Model

`model` takes an optional string (default is `mistral-medium-latest`):

```ruby
completion = client.chat('Provide code for fibonacci', model: OmniAI::Mistral::Chat::Model::CODESTRAL)
completion.text # 'def fibonacci(n)...end'
```

[Mistral API Reference `model`](https://docs.mistral.ai/getting-started/models/)
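
The model may also be provided as a plain string; the name below is taken from Mistral's published model list and is illustrative only:

```ruby
completion = client.chat('Tell me a joke!', model: 'mistral-small-latest')
```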

#### Temperature

`temperature` takes an optional float between `0.0` and `1.0` (default is `0.7`):

```ruby
completion = client.chat('Pick a number between 1 and 5', temperature: 1.0)
completion.text # '3'
```

[Mistral API Reference `temperature`](https://docs.mistral.ai/api/)

#### Stream

`stream` takes an optional proc to stream responses in real-time chunks instead of waiting for a complete response:

```ruby
stream = proc do |chunk|
  print(chunk.text) # 'Better', 'three', 'hours', ...
end
client.chat('Be poetic.', stream:)
```

[Mistral API Reference `stream`](https://docs.mistral.ai/api/)

#### Format

`format` takes an optional symbol (`:json`) that sets the `response_format` to `json_object`:

```ruby
completion = client.chat(format: :json) do |prompt|
  prompt.system(OmniAI::Chat::JSON_PROMPT)
  prompt.user('What is the name of the drummer for the Beatles?')
end
JSON.parse(completion.text) # { "name": "Ringo" }
```

[Mistral API Reference `response_format`](https://docs.mistral.ai/api/)

> When using JSON mode you MUST also instruct the model to produce JSON yourself with a system or a user message.

### Embed

Text can be converted into a vector embedding for similarity comparisons via:

```ruby
response = client.embed('The quick brown fox jumps over a lazy dog.')
response.embedding # [0.0, ...]
```