LLM Workflow Engine (LWE) Azure Chat OpenAI Provider plugin
https://github.com/llm-workflow-engine/lwe-plugin-provider-azure-openai-chat
- Host: GitHub
- URL: https://github.com/llm-workflow-engine/lwe-plugin-provider-azure-openai-chat
- Owner: llm-workflow-engine
- Created: 2023-09-03T02:03:35.000Z (over 1 year ago)
- Default Branch: main
- Last Pushed: 2024-04-10T21:52:33.000Z (about 1 year ago)
- Last Synced: 2024-04-11T00:50:38.237Z (about 1 year ago)
- Language: Python
- Size: 7.81 KB
- Stars: 0
- Watchers: 3
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
# LLM Workflow Engine (LWE) Azure OpenAI Chat Provider plugin
Azure OpenAI Chat Provider plugin for [LLM Workflow Engine](https://github.com/llm-workflow-engine/llm-workflow-engine)
Access to [Azure OpenAI Chat](https://learn.microsoft.com/en-us/azure/ai-services/openai/concepts/models) models.
## Installation
### Azure setup
You'll need to create a service resource and deploy models; see [here](https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/create-resource?pivots=web-portal).
You'll also need a key and endpoint for the resource; see [here](https://learn.microsoft.com/en-us/azure/ai-services/openai/chatgpt-quickstart?tabs=command-line&pivots=programming-language-python#retrieve-key-and-endpoint).
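Once the resource is deployed, the key and endpoint can be sanity-checked against the Azure OpenAI chat-completions REST endpoint. The sketch below only builds the request URL and headers; the endpoint, key, and deployment name are hypothetical placeholders:

```python
import json
import urllib.request


def build_chat_request(endpoint, api_key, deployment, api_version, messages):
    """Build a chat-completions request for an Azure OpenAI deployment."""
    url = (
        f"{endpoint.rstrip('/')}/openai/deployments/{deployment}"
        f"/chat/completions?api-version={api_version}"
    )
    body = json.dumps({"messages": messages}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={"api-key": api_key, "Content-Type": "application/json"},
    )


req = build_chat_request(
    "https://my-resource.openai.azure.com",  # hypothetical endpoint
    "my-key",                                # hypothetical key
    "gpt-35-turbo",                          # your deployment name
    "2024-02-01",
    [{"role": "user", "content": "Hello"}],
)
print(req.full_url)
# Send with urllib.request.urlopen(req) once real credentials are in place.
```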
### From packages
Install the latest version of this software directly from GitHub with pip:
```bash
pip install git+https://github.com/llm-workflow-engine/lwe-plugin-provider-azure-openai-chat
```

### From source (recommended for development)
Install the latest version of this software directly from git:
```bash
git clone https://github.com/llm-workflow-engine/lwe-plugin-provider-azure-openai-chat.git
```

Install the development package:
```bash
cd lwe-plugin-provider-azure-openai-chat
pip install -e .
```

## Configuration
The following environment variables (or the equivalent provider variables, see Usage below) need to be set:
* `export AZURE_OPENAI_API_KEY=[key]`
* `export AZURE_ENDPOINT=[endpoint]`
* `export AZURE_OPENAI_API_VERSION=2024-02-01`

Add the following to `config.yaml` in your profile:
```yaml
plugins:
enabled:
- provider_azure_openai_chat
# Any other plugins you want enabled...
```

## Usage
From a running LWE shell:
```
/provider azure_openai_chat
/model deployment_name gpt-35-turbo
# Instead of environment variables, these values can also be set directly on the model:
/model openai_api_key [key]
/model openai_endpoint [endpoint]
/model openai_api_version 2023-05-15
```
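Before launching the shell, the three environment variables from the Configuration section can be preflight-checked. A minimal sketch (the check itself is not part of the plugin; the variable names mirror the exports above):

```python
import os

REQUIRED_VARS = ("AZURE_OPENAI_API_KEY", "AZURE_ENDPOINT", "AZURE_OPENAI_API_VERSION")


def missing_azure_vars(environ=os.environ):
    """Return required Azure OpenAI variable names that are unset or empty."""
    return [name for name in REQUIRED_VARS if not environ.get(name)]


# Example against a hypothetical environment with only the key set:
partial_env = {"AZURE_OPENAI_API_KEY": "my-key"}
print(missing_azure_vars(partial_env))
# → ['AZURE_ENDPOINT', 'AZURE_OPENAI_API_VERSION']
```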