LLM Workflow Engine (LWE) Chat Ollama Provider plugin
https://github.com/llm-workflow-engine/lwe-plugin-provider-chat-ollama
- Host: GitHub
- URL: https://github.com/llm-workflow-engine/lwe-plugin-provider-chat-ollama
- Owner: llm-workflow-engine
- Created: 2023-11-18T20:26:05.000Z (over 1 year ago)
- Default Branch: main
- Last Pushed: 2024-07-07T00:22:45.000Z (10 months ago)
- Last Synced: 2025-01-08T13:02:43.584Z (4 months ago)
- Language: Python
- Size: 6.84 KB
- Stars: 2
- Watchers: 2
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
# LLM Workflow Engine (LWE) Chat Ollama Provider plugin
Chat Ollama Provider plugin for [LLM Workflow Engine](https://github.com/llm-workflow-engine/llm-workflow-engine)
Provides access to [Ollama](https://ollama.ai/library) chat models.
## Installation
### Ollama local server
Follow the [installation instructions](https://github.com/ollama/ollama) for Ollama, and make sure the server is running on port `11434`.
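Before enabling the plugin, you can confirm the server is reachable. A minimal stdlib sketch (the `/api/tags` endpoint, which lists locally pulled models, is part of Ollama's REST API; the helper name here is ours, not the plugin's):

```python
import json
import urllib.request
import urllib.error

def ollama_models(base_url="http://localhost:11434"):
    """Return locally pulled model names, or None if the server is unreachable."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=2) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except (urllib.error.URLError, OSError):
        return None

models = ollama_models()
print("Ollama models:", models if models is not None else "server not reachable")
```

If this prints `server not reachable`, start Ollama (and pull a model with `ollama pull llama2`) before proceeding.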
### Plugin
#### From packages
Install the latest version of this software directly from GitHub with `pip`:
```bash
pip install git+https://github.com/llm-workflow-engine/lwe-plugin-provider-chat-ollama
```

#### From source (recommended for development)
Install the latest version of this software directly from git:
```bash
git clone https://github.com/llm-workflow-engine/lwe-plugin-provider-chat-ollama.git
```

Install the development package:
```bash
cd lwe-plugin-provider-chat-ollama
pip install -e .
```

## Configuration
Add the following to `config.yaml` in your profile:
```yaml
plugins:
  enabled:
    - provider_chat_ollama
    # Any other plugins you want enabled...
```

## Usage
From a running LWE shell:
```
/provider chat_ollama
/model model llama2
```
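Under the hood, the provider drives Ollama's HTTP chat endpoint. As a rough sketch of the equivalent raw request (endpoint and payload shape are per Ollama's REST API, not taken from this plugin's source; the helper name is ours):

```python
import json
import urllib.request

def build_chat_request(model, prompt, base_url="http://localhost:11434"):
    """Build the POST request for Ollama's /api/chat endpoint."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # ask for one complete JSON reply instead of a stream
    }
    return urllib.request.Request(
        f"{base_url}/api/chat",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

# Sending it requires a running Ollama server, e.g.:
#   with urllib.request.urlopen(build_chat_request("llama2", "Hello")) as resp:
#       print(json.load(resp)["message"]["content"])
```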