https://github.com/ksylvest/omniai-mistral
An implementation of the OmniAI interface for Mistral.
- Host: GitHub
- URL: https://github.com/ksylvest/omniai-mistral
- Owner: ksylvest
- License: mit
- Created: 2024-06-10T19:16:03.000Z (over 1 year ago)
- Default Branch: main
- Last Pushed: 2025-09-17T19:15:47.000Z (22 days ago)
- Last Synced: 2025-09-17T21:34:47.116Z (21 days ago)
- Topics: lechat, mistral, omniai, ruby
- Language: Ruby
- Homepage: https://omniai-mistral.ksylvest.com
- Size: 110 KB
- Stars: 3
- Watchers: 3
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# OmniAI::Mistral
[MIT License](https://github.com/ksylvest/omniai-mistral/blob/main/LICENSE)
[RubyGems](https://rubygems.org/gems/omniai-mistral)
[GitHub](https://github.com/ksylvest/omniai-mistral)
[Documentation](https://omniai-mistral.ksylvest.com)
[CircleCI](https://circleci.com/gh/ksylvest/omniai-mistral)

A Mistral implementation of the [OmniAI](https://github.com/ksylvest/omniai) APIs.
## Installation
```sh
gem install omniai-mistral
```

## Usage
### Client
A client is set up as follows if `ENV['MISTRAL_API_KEY']` exists:
```ruby
client = OmniAI::Mistral::Client.new
```

A client may also be passed the following options:

- `api_key` (required - default is `ENV['MISTRAL_API_KEY']`)
- `host` (optional)

### Configuration
Global configuration is supported for the following options:
```ruby
OmniAI::Mistral.configure do |config|
config.api_key = 'sk-...' # default: ENV['MISTRAL_API_KEY']
config.host = '...' # default: 'https://api.mistral.ai'
end
```

### Chat
A chat completion is generated by passing in prompts using any of a variety of formats:
```ruby
completion = client.chat('Tell me a joke!')
completion.text # 'Why did the chicken cross the road? To get to the other side.'
```

```ruby
completion = client.chat do |prompt|
prompt.system('You are a helpful assistant.')
prompt.user('What is the capital of Canada?')
end
completion.text # 'The capital of Canada is Ottawa.'
```

#### Model
`model` takes an optional string (default is `mistral-medium-latest`):
```ruby
completion = client.chat('Provide code for fibonacci', model: OmniAI::Mistral::Chat::Model::CODESTRAL)
completion.text # 'def fibonacci(n)...end'
```

[Mistral API Reference `model`](https://docs.mistral.ai/getting-started/models/)
#### Temperature
`temperature` takes an optional float between `0.0` and `1.0` (default is `0.7`):
```ruby
completion = client.chat('Pick a number between 1 and 5', temperature: 1.0)
completion.text # '3'
```

[Mistral API Reference `temperature`](https://docs.mistral.ai/api/)
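Intuitively, temperature rescales the model's token probability distribution: low values concentrate probability on the most likely token, high values flatten the distribution. A minimal sketch of that effect, using a hypothetical softmax over toy logits (this is illustrative only, not part of the omniai-mistral API):

```ruby
# Illustrative only: how temperature reshapes a probability distribution.
# The logits and softmax here are toy values, not API internals.
def softmax(logits, temperature:)
  scaled = logits.map { |l| Math.exp(l / temperature) }
  sum = scaled.sum
  scaled.map { |s| s / sum }
end

logits = [2.0, 1.0, 0.5]
low  = softmax(logits, temperature: 0.2) # sharper: mass concentrates on the top logit
high = softmax(logits, temperature: 1.0) # flatter: sampling is more varied
```

With `temperature: 0.2` the top entry dominates, while `temperature: 1.0` leaves the alternatives with meaningful probability.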
#### Stream
`stream` takes an optional proc to stream responses in real-time chunks instead of waiting for the complete response:
```ruby
stream = proc do |chunk|
print(chunk.text) # 'Better', 'three', 'hours', ...
end
client.chat('Be poetic.', stream:)
```

[Mistral API Reference `stream`](https://docs.mistral.ai/api/)
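The proc is called once per chunk as text arrives, so it can accumulate as well as print. A minimal sketch of accumulation, using stub chunk objects in place of a live API call (only the `text` reader is assumed here):

```ruby
# Stub chunks stand in for the objects yielded during a live stream;
# only a `text` reader is assumed.
Chunk = Struct.new(:text)

buffer = +''
stream = proc { |chunk| buffer << chunk.text }

# Simulate three chunks arriving; in real usage pass `stream:` to `client.chat`.
[Chunk.new('Better '), Chunk.new('three '), Chunk.new('hours...')].each(&stream)

buffer # => "Better three hours..."
```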
#### Format
`format` takes an optional symbol (`:json`) that sets the `response_format` to `json_object`:
```ruby
completion = client.chat(format: :json) do |prompt|
prompt.system(OmniAI::Chat::JSON_PROMPT)
prompt.user('What is the name of the drummer for the Beatles?')
end
JSON.parse(completion.text) # { "name": "Ringo" }
```

[Mistral API Reference `response_format`](https://docs.mistral.ai/api/)
> When using JSON mode you MUST also instruct the model to produce JSON yourself with a system or a user message.
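Since the model generates the JSON text, defensive parsing is reasonable even in JSON mode. A sketch using Ruby's standard `JSON` module, with a placeholder string standing in for `completion.text`:

```ruby
require 'json'

# Placeholder for `completion.text` returned under JSON mode.
text = '{"name": "Ringo"}'

begin
  data = JSON.parse(text, symbolize_names: true)
rescue JSON::ParserError
  data = nil # fall back or retry in a real application
end

data # => { name: "Ringo" }
```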
### Embed
Text can be converted into a vector embedding for similarity comparisons:
```ruby
response = client.embed('The quick brown fox jumps over a lazy dog.')
response.embedding # [0.0, ...]
```
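Embeddings are typically compared with a metric such as cosine similarity. A minimal sketch over plain float arrays (the vectors below are toy values, not real model output):

```ruby
# Cosine similarity between two embedding vectors (arrays of floats).
def cosine_similarity(a, b)
  dot = a.zip(b).sum { |x, y| x * y }
  dot / (Math.sqrt(a.sum { |x| x * x }) * Math.sqrt(b.sum { |x| x * x }))
end

cosine_similarity([1.0, 0.0], [1.0, 0.0]) # => 1.0 (identical direction)
cosine_similarity([1.0, 0.0], [0.0, 1.0]) # => 0.0 (orthogonal)
```

In practice the inputs would be two `response.embedding` arrays; values closer to `1.0` indicate more similar texts.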