Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/jiggy-ai/chatstack
Minimalist Context Management for message-based GPTs
chat chat-application chatbot chatgpt chatgpt-api gpt-35-turbo gpt4
- Host: GitHub
- URL: https://github.com/jiggy-ai/chatstack
- Owner: jiggy-ai
- License: apache-2.0
- Created: 2023-03-15T23:42:16.000Z (over 1 year ago)
- Default Branch: master
- Last Pushed: 2023-11-07T15:37:03.000Z (about 1 year ago)
- Last Synced: 2024-10-11T13:51:16.577Z (about 1 month ago)
- Topics: chat, chat-application, chatbot, chatgpt, chatgpt-api, gpt-35-turbo, gpt4
- Language: Python
- Homepage:
- Size: 41 KB
- Stars: 22
- Watchers: 3
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
- awesome-technostructure - jiggy-ai/chatstack: Minimalist Context Management for message-based GPTs ([:robot: machine-learning](https://github.com/stars/ketsapiwiq/lists/robot-machine-learning))
README
# Chatstack
## Minimalist Context Management for message-based GPTs
This Python code provides a chatbot implementation with context management using OpenAI's GPT-3.5-turbo or GPT-4 chat models. The chatbot maintains a conversation history and helps manage the context state and size in tokens.
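The context-size management described above amounts to keeping the most recent messages that fit within a token budget. A minimal sketch of that idea (the `trim_context` helper and its tuple representation are illustrative assumptions, not part of chatstack's API):

```python
# Hypothetical sketch of the trimming idea: keep only the newest
# messages whose combined token counts fit within a budget.
def trim_context(messages, max_tokens):
    """messages: list of (text, token_count) pairs, newest last.
    Returns the newest suffix whose token counts fit in max_tokens."""
    kept, total = [], 0
    for text, tokens in reversed(messages):
        if total + tokens > max_tokens:
            break
        kept.append((text, tokens))
        total += tokens
    return list(reversed(kept))

history = [("hi", 1), ("hello!", 2), ("tell me a joke", 4), ("why did...", 8)]
print(trim_context(history, 12))  # oldest messages are dropped first
```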
### Dependencies
- loguru
- pydantic
- openai
- tiktoken
### OPENAI_API_KEY
Chatstack finds your OpenAI API key via the OPENAI_API_KEY environment variable.
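For example, the variable can be set in the current shell session before running your program (the key shown is a placeholder):

```shell
# Export the OpenAI API key so Chatstack can pick it up from the environment.
export OPENAI_API_KEY="sk-your-key-here"
```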
### Classes
- `ChatRoleMessage`: A base data class for messages with role, text, and tokens.
- `SystemMessage`: A data class for representing a message with the 'system' role.
- `ContextMessage`: A data class representing additional information context for the model.
- `AssistantMessage`: A data class for representing a message with the 'assistant' role.
- `UserMessage`: A data class for representing a message with the 'user' role.
- `ChatContext`: A class that manages the conversation context and generates responses using OpenAI message interface models.
- `ChatResponse`: A data class containing the model's response to a user message, a record of the input context sent to the model, and other details such as the model used, the number of tokens used, and the estimated cost of the request.
### Usage
1. Import the `ChatContext` class.
2. Create an instance of the `ChatContext` class with the desired configuration.
3. Call the `user_message` or `user_message_stream` methods with the user's message text to get a response from the chatbot.

Example:
```python
from chatstack import ChatContext

BASE_SYSTEM_PROMPT  = "You are a clever bot. Do not apologize, or make excuses. "
BASE_SYSTEM_PROMPT += "Do not mention that you are an AI language model since that is annoying to users."

def main():
    chat_context = ChatContext(base_system_msg_text=BASE_SYSTEM_PROMPT)

    print("Welcome to the Chatbot!")

    while True:
        user_input = input("You: ")
        print("Chatbot:")
        response = chat_context.user_message(user_input, stream=True)
        print(response.text)

if __name__ == "__main__":
    main()
```
### Configuration
The `ChatContext` class accepts the following parameters:
- `min_response_tokens`: Minimum number of tokens to reserve for model completion response.
- `max_response_tokens`: Maximum number of tokens to allow for model completion response.
- `chat_context_messages`: Number of recent assistant and user messages to keep in context.
- `model`: The name of the GPT model to use (default: "gpt-3.5-turbo").
- `temperature`: The temperature for the model's response generation.
- `base_system_msg_text`: The base system message text to provide context for the model.

The primary method of the `ChatContext` class is `user_message()`, which assembles the input context for the model and generates a completion.
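The relationship between `min_response_tokens` and the prompt side of the context can be illustrated with a small calculation (the helper name and the 4096-token window are illustrative assumptions, not chatstack's API):

```python
# Hypothetical sketch of the budget arithmetic implied by the parameters
# above: reserving room for the reply bounds how much prompt context fits.
def prompt_token_budget(model_max_tokens: int, min_response_tokens: int) -> int:
    """Tokens left for the input context after reserving the response minimum."""
    return model_max_tokens - min_response_tokens

# e.g. a 4096-token model window with 400 tokens reserved for the reply:
print(prompt_token_budget(4096, 400))  # → 3696
```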
### `user_message(msg_text: str) -> ChatResponse`
This method takes a user's message text as input and generates a response from the chatbot using the conversation context.
### `user_message_stream(msg_text: str) -> ChatResponse`
This method is a generator that takes a user's message text as input and yields `ChatResponse` objects containing the incremental and cumulative response text from the chatbot using the conversation context.
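Because it is a generator, the stream is consumed with a `for` loop. A toy illustration of the pattern (the `fake_stream` stand-in below is hypothetical and only mimics the cumulative-text behavior described above; it is not chatstack's implementation):

```python
def fake_stream(chunks):
    """Stand-in generator yielding cumulative text, mimicking the
    incremental ChatResponse objects described above."""
    cumulative = ""
    for chunk in chunks:
        cumulative += chunk
        yield cumulative

# Consume the stream; the last yielded value holds the full text.
last = None
for partial in fake_stream(["Hel", "lo ", "there"]):
    last = partial
print(last)
```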
### `add_message(msg: ChatRoleMessage)`
Add a message to the context for presentation to the model in subsequent completion requests.
#### Parameters:
- `msg_text` (str): The text of the user's message.
#### Returns:
- `ChatResponse`: An instance of the `ChatResponse` data class that includes the model response text, the actual input messages sent to the model, and other relevant details such as the token counts and estimated price of the completion.