https://github.com/rockchinq/callinggpt
Build your own ChatGPT plugin platform with GPT's function calling ability | func call by GPT
- Host: GitHub
- URL: https://github.com/rockchinq/callinggpt
- Owner: RockChinQ
- Created: 2023-06-17T03:57:58.000Z (over 2 years ago)
- Default Branch: main
- Last Pushed: 2023-07-08T02:32:26.000Z (over 2 years ago)
- Last Synced: 2025-01-14T00:54:42.554Z (9 months ago)
- Topics: chatgpt, func-call, gpt, openai
- Language: Python
- Homepage:
- Size: 49.8 KB
- Stars: 66
- Watchers: 3
- Forks: 3
- Open Issues: 4
Metadata Files:
- Readme: README.md
README
# CallingGPT
English | [简体中文](README_cn.md)
[PyPI](https://pypi.python.org/pypi/CallingGPT)
GPT's Function Calling demo: an experiment in a self-hosted, ChatGPT-Plugins-like platform.
> Recommended reading: [function-calling](https://platform.openai.com/docs/guides/gpt/function-calling)
## Abstract
OpenAI's GPT models provide a function calling feature, so we can easily build `ChatGPT-Plugins-like` tools. This repository is a proof of concept of that feature.
In this experiment, we define a `Plugin` as a `Namespace` that contains a series of functions. While the user carries on a conversation, the functions in the `Namespace` are called by the API and their results are returned to the user.

## Usage
1. Clone this repository and install the dependencies.
```bash
git clone https://github.com/RockChinQ/CallingGPT
cd CallingGPT
pip install -r requirements.txt
```
2. Run `main.py` to generate `config.yaml`.
```bash
python main.py
```
3. Edit `config.yaml` to set your API key and other settings.
4. Run `main.py` and pass your modules:

```bash
python main.py ...
```

## Example
Use `examples/greet.py`; it provides a `greet` function that is called when the user asks GPT to greet someone.
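For reference, such a function can be as small as the following sketch (hypothetical code, not the repository's actual `examples/greet.py`; the `user` parameter name matches the function call shown in the transcript below):

```python
def greet(user: str) -> str:
    """
    Greet the named user.

    Args:
        user(str): The name of the person to greet.

    Returns:
        The greeting message.
    """
    return f"Hello, {user}!"
```

Run it with: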
```bash
python main.py examples/greet.py
```

Then you can talk to the bot.
```
$ python main.py examples/greet.py
Using module: examples.greet
>>> Hello and who are you?
<<< Hello! I am an AI assistant. How can I assist you today?
>>> say hello to Rock
call: {
"user": "Rock"
}
<<< Hello, Rock! How can I assist you today?
>>> and to Alice
call: {
"user": "Alice"
}
<<< Hello, Alice! How can I assist you today?
>>>
```

Type `help` to get help.
See the [wiki](https://github.com/RockChinQ/CallingGPT/wiki/1.-Function-Format) for the function format.

### Other Examples
`examples/draw_and_wrapper_md.py` provides a `dalle_draw` function that uses the DALL·E model when the user asks GPT to draw something.
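A rough sketch of how such a function could be written (hypothetical, not the repository's actual code; it assumes the pre-1.0 `openai` Python SDK, where `openai.Image.create` returns an image URL, which is then wrapped in Markdown):

```python
import openai

def dalle_draw(prompt: str) -> str:
    """
    Draw an image with DALL·E and return it wrapped in Markdown.

    Args:
        prompt(str): A description of the image to draw.

    Returns:
        A Markdown image tag pointing at the generated picture.
    """
    resp = openai.Image.create(prompt=prompt, n=1, size="512x512")
    url = resp["data"][0]["url"]
    return f"![{prompt}]({url})"
```

Run it with: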
```bash
python main.py examples/draw_and_wrapper_md.py
```

```
$ python main.py examples/draw_and_wrapper_md.py
Using module: examples.draw_and_wrapper_md
>>> hello!
<<< Hi there! How can I assist you today?
>>> draw a sunset for me please
call: {
"prompt": "sunset"
}
<<< Sure! Here's a beautiful sunset for you:
I hope you like it! Let me know if there's anything else I can help you with.
>>>
```

## For Code
1. Install the package
```bash
pip install --upgrade CallingGPT
```
2. Create your own functions in modules (these modules can also be used in CLI mode):
```python
# your_module_a.py
def func_a(prompt: str) -> str:  # Type hint of EACH argument and return value is REQUIRED.
    """
    The description of this func_a; it will be provided to the API.

    Args:
        prompt(str): The prompt of the function.

    Returns:
        The result of the function.
    """
    # A Google-style docstring is REQUIRED; it will be split into
    # `description`, `params` (required if there are args) and
    # `returns` (optional), separated by `\n\n`.
    return "func_a: " + prompt
```

```python
# your_module_b.py
def adder(a: int, b: int) -> int:
    """
    Add two numbers.

    Args:
        a: The first number.
        b: The second number.

    Returns:
        The sum of a and b.
    """
    # Type hints for args in the docstring are optional.
    return a + b
```
3. Call the wrapper:
```python
from CallingGPT.session.session import Session
import your_module_a, your_module_b
import openai

openai.api_key = 'your_openai_api_key'

session = Session([your_module_a, your_module_b])
for reply in session.ask("your prompt"):
    # session.ask yields each time the API returns a result;
    # before calling a function, it prints the function name and args.
    # e.g. here's a function call:
    # {
    #     "role": "assistant",
    #     "content": null,
    #     "function_call": {
    #         "name": "examples-draw_and_wrapper_md-draw",
    #         "arguments": "{\n  \"prompt\": \"cat\"\n}"
    #     }
    # }
    #
    # while here's a normal reply:
    # {
    #     "role": "assistant",
    #     "content": "Hello, I am an AI assistant. How can I assist you today?"
    # }
    print(reply)
```

`Session` will automatically manage context for you.
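For instance, a follow-up turn can refer back to an earlier one (a minimal sketch using the same `session` object created above; the prompts are illustrative only):

```python
# First turn: may trigger a function call plus a normal reply.
for reply in session.ask("draw a cat for me"):
    print(reply)

# Second turn: `Session` keeps the earlier messages, so the model can
# resolve "another one" against the previous request.
for reply in session.ask("now another one, but sleeping"):
    print(reply)
```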
See [wiki](https://github.com/RockChinQ/CallingGPT/wiki/1.-Function-Format) for the function format.