# ChatGPT-API-unity

A client library of [ChatGPT chat completion API](https://platform.openai.com/docs/api-reference/chat/create) for Unity.

See also official [document](https://platform.openai.com/docs/guides/chat) and [API reference](https://platform.openai.com/docs/api-reference/chat).

## How to import via Unity Package Manager

Add the following dependencies to your `/Packages/manifest.json`:

```json
{
  "dependencies": {
    "com.mochineko.chatgpt-api": "https://github.com/mochi-neko/ChatGPT-API-unity.git?path=/Assets/Mochineko/ChatGPT_API#0.7.3",
    ...
  }
}
```

## How to use chat completion by ChatGPT API

1. Generate an API key on [OpenAI](https://platform.openai.com/account/api-keys). (Keep your API key secret; do not publish or commit it.)
2. Specify the chat model. (Available models are defined by `Model`.)
3. Create an instance of `ChatCompletionAPIConnection` with your API key and chat model. (This instance memorizes past messages in the session.)
4. Optionally set a system message (prompt) in the constructor of `ChatCompletionAPIConnection` to instruct the assistant about your situation.
5. Input a user message and call `ChatCompletionAPIConnection.CompleteChatAsync()`.
6. The response message is in `ChatCompletionResponseBody.ResultMessage` (= `ChatCompletionResponseBody.Choices[0].Message.Content`).

A minimal sample using [UniTask](https://github.com/Cysharp/UniTask):

```csharp
#nullable enable
using System;
using System.Threading;
using Cysharp.Threading.Tasks;
using Mochineko.ChatGPT_API.Memories;
using UnityEngine;

namespace Mochineko.ChatGPT_API.Samples
{
    /// <summary>
    /// A sample component to complete chat by ChatGPT API on Unity.
    /// </summary>
    public sealed class ChatCompletionSample : MonoBehaviour
    {
        /// <summary>
        /// API key generated by OpenAI.
        /// </summary>
        [SerializeField] private string apiKey = string.Empty;

        /// <summary>
        /// System message to instruct the assistant.
        /// </summary>
        [SerializeField, TextArea] private string systemMessage = string.Empty;

        /// <summary>
        /// Message sent to the ChatGPT API.
        /// </summary>
        [SerializeField, TextArea] private string message = string.Empty;

        /// <summary>
        /// Max number of messages in the chat memory queue.
        /// </summary>
        [SerializeField] private int maxMemoryCount = 20;

        private ChatCompletionAPIConnection? connection;
        private IChatMemory? memory;

        private void Start()
        {
            // API key must be set.
            if (string.IsNullOrEmpty(apiKey))
            {
                Debug.LogError("OpenAI API key must be set.");
                return;
            }

            memory = new FiniteQueueChatMemory(maxMemoryCount);

            // Create an instance of ChatCompletionAPIConnection, specifying the chat model.
            connection = new ChatCompletionAPIConnection(
                apiKey,
                memory,
                systemMessage);
        }

        [ContextMenu(nameof(SendChat))]
        public void SendChat()
        {
            SendChatAsync(this.GetCancellationTokenOnDestroy()).Forget();
        }

        [ContextMenu(nameof(ClearChatMemory))]
        public void ClearChatMemory()
        {
            memory?.ClearAllMessages();
        }

        private async UniTask SendChatAsync(CancellationToken cancellationToken)
        {
            // Validations
            if (connection == null)
            {
                Debug.LogError($"[ChatGPT_API.Samples] Connection is null.");
                return;
            }

            if (string.IsNullOrEmpty(message))
            {
                Debug.LogError($"[ChatGPT_API.Samples] Chat content is empty.");
                return;
            }

            ChatCompletionResponseBody response;
            try
            {
                await UniTask.SwitchToThreadPool();

                // Create a message by the ChatGPT chat completion API.
                response = await connection.CompleteChatAsync(
                    message,
                    cancellationToken);
            }
            catch (Exception e)
            {
                // Exceptions should be caught.
                Debug.LogException(e);
                return;
            }

            await UniTask.SwitchToMainThread(cancellationToken);

            // Log the chat completion result.
            Debug.Log($"[ChatGPT_API.Samples] Result:\n{response.ResultMessage}");
        }
    }
}
```

See also [Sample](./Assets/Mochineko/ChatGPT_API.Samples/ChatCompletionSample.cs).

## How to use chat completion by ChatGPT API more resiliently

See `RelentChatCompletionAPIConnection` and `RelentChatCompletionSample`
using [Relent](https://github.com/mochi-neko/Relent).

You can call the API with explicit error handling, retries, timeouts, bulkheads, and so on.

```json
{
  "dependencies": {
    "com.mochineko.chatgpt-api.relent": "https://github.com/mochi-neko/ChatGPT-API-unity.git?path=/Assets/Mochineko/ChatGPT_API.Relent#0.7.3",
    "com.mochineko.chatgpt-api": "https://github.com/mochi-neko/ChatGPT-API-unity.git?path=/Assets/Mochineko/ChatGPT_API#0.7.3",
    "com.mochineko.relent": "https://github.com/mochi-neko/Relent.git?path=/Assets/Mochineko/Relent#0.2.0",
    "com.cysharp.unitask": "https://github.com/Cysharp/UniTask.git?path=src/UniTask/Assets/Plugins/UniTask",
    ...
  }
}
```

## How to calculate the token length of text locally

You can calculate the token length of a text
with `TiktokenSharp.TikToken.Encode(string)` as follows:

```csharp
using TiktokenSharp;

private int CalculateTokenLength()
{
    string text = "A text that you want to calculate the token length of.";

    // Specify the model name.
    TikToken tikToken = TikToken.EncodingForModel("gpt-3.5-turbo");

    // Encoding means tokenizing.
    var tokens = tikToken.Encode(text);

    return tokens.Count;
}
```

If you want to calculate token lengths in the Unity editor,
use the `Mochineko > TiktokenEditor` window from the editor menu.

## How to customize chat memories

You can customize chat memory by implementing the `IChatMemory` interface.

Presets are available:

- `FiniteQueueChatMemory`
  - A queue bounded by a max number of messages.
- `FiniteQueueWithFixedPromptsChatMemory`
  - A queue bounded by a max number of user/assistant messages, with no limit on prompts (system messages).
- `FiniteTokenLengthQueueChatMemory`
  - A queue bounded by a max token length over all messages.
- `FiniteTokenLengthQueueWithFixedPromptsChatMemory`
  - A queue bounded by a max token length of user/assistant messages, with no limit on prompts (system messages).
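A preset is swapped in when constructing the connection. The following is a sketch only: the `maxTokenLength` constructor argument of `FiniteTokenLengthQueueChatMemory` is an assumption, so check the package source for the actual signature.

```csharp
using Mochineko.ChatGPT_API;
using Mochineko.ChatGPT_API.Memories;

// Assumed to already exist in your context: apiKey and systemMessage strings.
// NOTE: the maxTokenLength argument below is an assumed constructor parameter
// of FiniteTokenLengthQueueChatMemory, not a confirmed signature.
IChatMemory memory = new FiniteTokenLengthQueueChatMemory(maxTokenLength: 2000);

// The connection uses whichever IChatMemory implementation it is given.
var connection = new ChatCompletionAPIConnection(
    apiKey,
    memory,
    systemMessage);
```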

## How to stream response

See [streaming sample](./Assets/Mochineko/ChatGPT_API.Samples/ChatCompletionAsStreamSample.cs).

You can consume the stream with `await foreach` as follows:

```csharp
var builder = new StringBuilder();

// Receive an async enumerable from the ChatGPT chat completion API.
var enumerable = await connection.CompleteChatAsStreamAsync(
    message,
    cancellationToken);

await foreach (var chunk in enumerable.WithCancellation(cancellationToken))
{
    // The first chunk has only the "role" element.
    if (chunk.Choices[0].Delta.Content is null)
    {
        Debug.Log($"[ChatGPT_API.Samples] Role:{chunk.Choices[0].Delta.Role}.");
        continue;
    }

    var delta = chunk.Choices[0].Delta.Content;
    builder.Append(delta);
    Debug.Log($"[ChatGPT_API.Samples] Delta:{delta}, Current:{builder}");
}

// Log the chat completion result.
Debug.Log($"[ChatGPT_API.Samples] Completed: \n{builder}");
```

## How to use function calling

1. Define a function with a JSON schema.
2. Specify the function in the request parameters.
3. Call the chat completion API.
4. Use `result.Choices[0].Message.FunctionCall`.
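The function definition in step 1 follows the OpenAI function-calling JSON schema. A hypothetical definition (the function name and parameters below are illustrative only, not taken from this repository):

```json
{
  "name": "get_weather",
  "description": "Get the current weather for a city.",
  "parameters": {
    "type": "object",
    "properties": {
      "city": {
        "type": "string",
        "description": "City name, e.g. Tokyo"
      }
    },
    "required": ["city"]
  }
}
```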

See `FunctionCalling()` in [test code](./Assets/Mochineko/ChatGPT_API.Tests/ChatCompletionAPIConnectionTest.cs).

## Changelog

See [CHANGELOG](./CHANGELOG.md).

## 3rd Party Notices

See [NOTICE](./NOTICE.md).

## License

Licensed under the [MIT](./LICENSE) license.