Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/mochi-neko/ChatGPT-API-unity
A client library of ChatGPT chat completion API for Unity.
- Host: GitHub
- URL: https://github.com/mochi-neko/ChatGPT-API-unity
- Owner: mochi-neko
- License: mit
- Created: 2023-03-04T00:32:36.000Z (almost 2 years ago)
- Default Branch: main
- Last Pushed: 2023-11-07T11:28:31.000Z (about 1 year ago)
- Last Synced: 2024-11-09T18:42:17.697Z (about 1 month ago)
- Topics: chatgpt-api, unity
- Language: C#
- Homepage:
- Size: 182 KB
- Stars: 119
- Watchers: 2
- Forks: 14
- Open Issues: 2
- Metadata Files:
  - Readme: README.md
  - Changelog: CHANGELOG.md
  - License: LICENSE
Awesome Lists containing this project
- ai-game-devtools - ChatGPT-API-unity
README
# ChatGPT-API-unity
A client library of [ChatGPT chat completion API](https://platform.openai.com/docs/api-reference/chat/create) for Unity.
See also the official [documentation](https://platform.openai.com/docs/guides/chat) and [API reference](https://platform.openai.com/docs/api-reference/chat).
## How to import by Unity Package Manager
Add the following dependency to your `/Packages/manifest.json`:
```json
{
  "dependencies": {
    "com.mochineko.chatgpt-api": "https://github.com/mochi-neko/ChatGPT-API-unity.git?path=/Assets/Mochineko/ChatGPT_API#0.7.3",
    ...
  }
}
```
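The chat completion sample in the next section uses [UniTask](https://github.com/Cysharp/UniTask). If it is not already in your project, it can be added via the Unity Package Manager as well; the Git URL below is the same one referenced later in this README for the resilient setup:

```json
{
  "dependencies": {
    "com.cysharp.unitask": "https://github.com/Cysharp/UniTask.git?path=src/UniTask/Assets/Plugins/UniTask",
    ...
  }
}
```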
## How to use chat completion by ChatGPT API

1. Generate an API key on [OpenAI](https://platform.openai.com/account/api-keys). (Keep your API key secret; it should not be exposed publicly.)
2. Choose a chat model. (Available models are defined by `Model`.)
3. Create an instance of `ChatCompletionAPIConnection` with the API key and chat model. (This instance remembers previous messages in the session.)
4. You can set a system message (prompt) to instruct the assistant about your situation via the constructor of `ChatCompletionAPIConnection`.
5. Input a user message and call `ChatCompletionAPIConnection.CompleteChatAsync()`.
6. The response message is in `ChatCompletionResponseBody.ResultMessage` (= `ChatCompletionResponseBody.Choices[0].Message.Content`).

A basic sample using [UniTask](https://github.com/Cysharp/UniTask) is as follows:
```csharp
#nullable enable
using System;
using System.Threading;
using Cysharp.Threading.Tasks;
using Mochineko.ChatGPT_API.Memories;
using UnityEngine;

namespace Mochineko.ChatGPT_API.Samples
{
    /// <summary>
    /// A sample component to complete chat by ChatGPT API on Unity.
    /// </summary>
    public sealed class ChatCompletionSample : MonoBehaviour
    {
        /// <summary>
        /// API key generated by OpenAI.
        /// </summary>
        [SerializeField] private string apiKey = string.Empty;

        /// <summary>
        /// System message to instruct assistant.
        /// </summary>
        [SerializeField, TextArea] private string systemMessage = string.Empty;

        /// <summary>
        /// Message sent to ChatGPT API.
        /// </summary>
        [SerializeField, TextArea] private string message = string.Empty;

        /// <summary>
        /// Max number of chat memories in the queue.
        /// </summary>
        [SerializeField] private int maxMemoryCount = 20;

        private ChatCompletionAPIConnection? connection;
        private IChatMemory? memory;

        private void Start()
        {
            // API key must be set.
            if (string.IsNullOrEmpty(apiKey))
            {
                Debug.LogError("OpenAI API key must be set.");
                return;
            }

            memory = new FiniteQueueChatMemory(maxMemoryCount);

            // Create an instance of ChatCompletionAPIConnection, specifying the chat model.
            connection = new ChatCompletionAPIConnection(
                apiKey,
                memory,
                systemMessage);
        }

        [ContextMenu(nameof(SendChat))]
        public void SendChat()
        {
            SendChatAsync(this.GetCancellationTokenOnDestroy()).Forget();
        }

        [ContextMenu(nameof(ClearChatMemory))]
        public void ClearChatMemory()
        {
            memory?.ClearAllMessages();
        }

        private async UniTask SendChatAsync(CancellationToken cancellationToken)
        {
            // Validations
            if (connection == null)
            {
                Debug.LogError($"[ChatGPT_API.Samples] Connection is null.");
                return;
            }

            if (string.IsNullOrEmpty(message))
            {
                Debug.LogError($"[ChatGPT_API.Samples] Chat content is empty.");
                return;
            }

            ChatCompletionResponseBody response;
            try
            {
                await UniTask.SwitchToThreadPool();

                // Create message by ChatGPT chat completion API.
                response = await connection.CompleteChatAsync(
                    message,
                    cancellationToken);
            }
            catch (Exception e)
            {
                // Exceptions should be caught.
                Debug.LogException(e);
                return;
            }

            await UniTask.SwitchToMainThread(cancellationToken);

            // Log chat completion result.
            Debug.Log($"[ChatGPT_API.Samples] Result:\n{response.ResultMessage}");
        }
    }
}
```

See also the [sample](./Assets/Mochineko/ChatGPT_API.Samples/ChatCompletionSample.cs).
## How to use chat completion by ChatGPT API more resiliently

See `RelentChatCompletionAPIConnection` and `RelentChatCompletionSample`, which use [Relent](https://github.com/mochi-neko/Relent).
You can use the API with explicit error handling, retry, timeout, bulkhead, and so on. Add the following dependencies to your `/Packages/manifest.json`:
```json
{
  "dependencies": {
    "com.mochineko.chatgpt-api.relent": "https://github.com/mochi-neko/ChatGPT-API-unity.git?path=/Assets/Mochineko/ChatGPT_API.Relent#0.7.3",
    "com.mochineko.chatgpt-api": "https://github.com/mochi-neko/ChatGPT-API-unity.git?path=/Assets/Mochineko/ChatGPT_API#0.7.3",
    "com.mochineko.relent": "https://github.com/mochi-neko/Relent.git?path=/Assets/Mochineko/Relent#0.2.0",
    "com.cysharp.unitask": "https://github.com/Cysharp/UniTask.git?path=src/UniTask/Assets/Plugins/UniTask",
    ...
  }
}
```

## How to calculate token length of text locally

You can calculate the token length of text with `TiktokenSharp.TikToken.Encode(string)` as follows:

```csharp
using TiktokenSharp;

private int CalculateTokenLength()
{
    string text = "A text that you want to calculate token length.";

    // Specify model name.
    TikToken tikToken = TikToken.EncodingForModel("gpt-3.5-turbo");

    // Encoding is tokenizing.
    var tokens = tikToken.Encode(text);

    return tokens.Count;
}
```

If you want to calculate token length in the Unity editor, use the `ItemuMenu > Mochineko > TiktokenEditor` window.

## How to customize chat memories
You can customize chat memory by implementing the `IChatMemory` interface (a rough sketch follows the preset list below).
The following presets are available:
- `FiniteQueueChatMemory`
  - A queue that keeps at most a maximum number of messages.
- `FiniteQueueWithFixedPromptsChatMemory`
  - A queue that limits the number of user/assistant messages while keeping any number of prompts (system messages).
- `FiniteTokenLengthQueueChatMemory`
  - A queue that limits the total token length of all messages.
- `FiniteTokenLengthQueueWithFixedPromptsChatMemory`
  - A queue that limits the total token length of user/assistant messages while keeping any number of prompts (system messages).
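As a rough, hypothetical sketch of implementing `IChatMemory` yourself: only `ClearAllMessages()` is confirmed by the sample earlier in this README, so the other member names (`Messages`, `AddMessageAsync`) and the use of a `Message` type below are assumptions; check the interface definition in the package source before relying on them.

```csharp
#nullable enable
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;

namespace Mochineko.ChatGPT_API.Memories
{
    // Hypothetical sketch of a custom chat memory that never evicts messages.
    // Member names other than ClearAllMessages() are assumptions, not the confirmed interface.
    public sealed class UnlimitedChatMemory : IChatMemory
    {
        private readonly List<Message> messages = new List<Message>();

        // Assumed member: messages to send with the next request.
        public IReadOnlyList<Message> Messages => messages;

        // Assumed member: records a new message into the memory.
        public Task AddMessageAsync(Message message, CancellationToken cancellationToken)
        {
            messages.Add(message);
            return Task.CompletedTask;
        }

        // Confirmed by the sample above.
        public void ClearAllMessages()
            => messages.Clear();
    }
}
```

Once implemented, such a custom memory can be passed to the `ChatCompletionAPIConnection` constructor in place of a preset, exactly as `FiniteQueueChatMemory` is used in the sample above.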
## How to stream response

See the [streaming sample](./Assets/Mochineko/ChatGPT_API.Samples/ChatCompletionAsStreamSample.cs).
You can iterate over the response with `await foreach` as follows:
```csharp
var builder = new StringBuilder();

// Receive enumerable from ChatGPT chat completion API.
var enumerable = await connection.CompleteChatAsStreamAsync(
    message,
    cancellationToken);

await foreach (var chunk in enumerable.WithCancellation(cancellationToken))
{
    // The first chunk has only the "role" element.
    if (chunk.Choices[0].Delta.Content is null)
    {
        Debug.Log($"[ChatGPT_API.Samples] Role:{chunk.Choices[0].Delta.Role}.");
        continue;
    }

    var delta = chunk.Choices[0].Delta.Content;
    builder.Append(delta);
    Debug.Log($"[ChatGPT_API.Samples] Delta:{delta}, Current:{builder}");
}

// Log chat completion result.
Debug.Log($"[ChatGPT_API.Samples] Completed: \n{builder}");
```
## How to use function calling
1. Define a function with a JSON schema.
2. Specify the function in the request parameters.
3. Call the chat completion API.
4. Use `result.Choices[0].Message.FunctionCall` (see the sketch below).

See `FunctionCalling()` in the [test code](./Assets/Mochineko/ChatGPT_API.Tests/ChatCompletionAPIConnectionTest.cs).
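As a rough illustration of step 4 only: here `result` is the response returned by the chat completion call in step 3, the property names `Name` and `Arguments` are assumptions mirroring the corresponding fields of the OpenAI chat API response, and the request-side setup (defining the function and its JSON schema) is omitted, so treat the linked test as the authoritative usage.

```csharp
// Hedged sketch: consume the function call returned by the chat completion API.
// NOTE: Name and Arguments are assumed property names mirroring the OpenAI API fields.
var functionCall = result.Choices[0].Message.FunctionCall;
if (functionCall != null)
{
    // Arguments is a JSON string produced by the model according to the declared schema.
    Debug.Log($"[ChatGPT_API.Samples] Function:{functionCall.Name}, Arguments:{functionCall.Arguments}");
}
```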
## Changelog
See [CHANGELOG](./CHANGELOG.md).
## 3rd Party Notices
See [NOTICE](./NOTICE.md).
## License
Licensed under the [MIT](./LICENSE) license.