# OmniAI::DeepSeek

[![LICENSE](https://img.shields.io/badge/license-MIT-blue.svg)](https://github.com/ksylvest/omniai-deepseek/blob/main/LICENSE)
[![RubyGems](https://img.shields.io/gem/v/omniai-deepseek)](https://rubygems.org/gems/omniai-deepseek)
[![GitHub](https://img.shields.io/badge/github-repo-blue.svg)](https://github.com/ksylvest/omniai-deepseek)
[![Yard](https://img.shields.io/badge/docs-site-blue.svg)](https://omniai-deepseek.ksylvest.com)
[![CircleCI](https://img.shields.io/circleci/build/github/ksylvest/omniai-deepseek)](https://circleci.com/gh/ksylvest/omniai-deepseek)

A DeepSeek implementation of the [OmniAI](https://github.com/ksylvest/omniai) interface for [DeepSeek](https://www.deepseek.com/).

## Installation

```sh
gem install omniai-deepseek
```
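
Alternatively, in a Bundler-managed project, the gem can be added to the `Gemfile` and installed with `bundle install` (a minimal sketch):

```ruby
# Gemfile
source 'https://rubygems.org'

gem 'omniai-deepseek'
```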

## Usage

### Client

A client is set up as follows if `ENV['DEEPSEEK_API_KEY']` exists:

```ruby
client = OmniAI::DeepSeek::Client.new
```

A client may also be passed the following options:

- `api_key` (required - default is `ENV['DEEPSEEK_API_KEY']`)
- `host` (optional)
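
For example, a client may be constructed with an explicit key and host (a sketch assuming keyword arguments; the values are placeholders):

```ruby
client = OmniAI::DeepSeek::Client.new(
  api_key: 'sk-...',                # overrides ENV['DEEPSEEK_API_KEY']
  host: 'https://api.deepseek.com'  # optional; the default host
)
```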

### Configuration

Global configuration is supported for the following options:

```ruby
OmniAI::DeepSeek.configure do |config|
  config.api_key = 'sk-...' # default: ENV['DEEPSEEK_API_KEY']
  config.host = '...' # default: 'https://api.deepseek.com'
end
```

### Chat

A chat completion is generated by passing in a simple text prompt:

```ruby
completion = client.chat('Tell me a joke!')
completion.content # 'Why did the chicken cross the road? To get to the other side.'
```

A chat completion may also be generated by using a prompt builder:

```ruby
completion = client.chat do |prompt|
  prompt.system('You are an expert in geography.')
  prompt.user('What is the capital of Canada?')
end
completion.content # 'The capital of Canada is Ottawa.'
```

#### Model

`model` takes an optional string (default is `deepseek-chat`):

```ruby
completion = client.chat('How fast is a cheetah?', model: OmniAI::DeepSeek::Chat::Model::REASONER)
completion.content # 'A cheetah can reach speeds over 100 km/h.'
```

[DeepSeek API Reference `model`](https://api-docs.deepseek.com/quick_start/pricing)

#### Temperature

`temperature` takes an optional float between `0.0` and `2.0` (default is `0.7`):

```ruby
completion = client.chat('Pick a number between 1 and 5', temperature: 2.0)
completion.content # '3'
```

[DeepSeek API Reference `temperature`](https://api-docs.deepseek.com/quick_start/parameter_settings)

#### Stream

`stream` takes an optional proc used to stream responses in real-time chunks instead of waiting for the complete response:

```ruby
stream = proc do |chunk|
  print(chunk.content) # 'Better', 'three', 'hours', ...
end
client.chat('Be poetic.', stream:)
```

[DeepSeek API Reference `stream`](https://platform.deepseek.com/docs/api-reference/chat/create#chat-create-stream)

#### Format

`format` takes an optional symbol (`:json`) that sets the `response_format` to `json_object`:

```ruby
completion = client.chat(format: :json) do |prompt|
  prompt.system(OmniAI::Chat::JSON_PROMPT)
  prompt.user('What is the name of the drummer for the Beatles?')
end
JSON.parse(completion.content) # { "name": "Ringo" }
```

[DeepSeek API Reference `response_format`](https://platform.deepseek.com/docs/api-reference/chat/create#chat-create-stream)

> When using JSON mode, you must also instruct the model to produce JSON yourself via a system or user message.