Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/ivanfioravanti/chatbot-ollama
Chatbot Ollama is an open source chat UI for Ollama.
- Host: GitHub
- URL: https://github.com/ivanfioravanti/chatbot-ollama
- Owner: ivanfioravanti
- License: other
- Created: 2023-10-01T23:59:36.000Z (over 1 year ago)
- Default Branch: main
- Last Pushed: 2024-08-05T18:16:59.000Z (5 months ago)
- Last Synced: 2025-01-02T19:03:27.538Z (10 days ago)
- Language: TypeScript
- Size: 395 KB
- Stars: 1,499
- Watchers: 17
- Forks: 257
- Open Issues: 23
Metadata Files:
- Readme: README.md
- Contributing: CONTRIBUTING.md
- License: LICENSE
- Security: SECURITY.md
Awesome Lists containing this project
- awesome-local-llms - chatbot-ollama
- awesome - ivanfioravanti/chatbot-ollama - Chatbot Ollama is an open source chat UI for Ollama. (TypeScript)
- Awesome-Ollama - Chatbot UI
- StarryDivineSky - ivanfioravanti/chatbot-ollama - Based on the chatbot-ui project. It lets users interact with Ollama models through a simple interface and supports customizing the model, system prompt, temperature, and other parameters. Chatbot Ollama can be used via a Docker image or run locally, with the default model, system prompt, and temperature configured through environment variables. (A01_Text Generation_Text Dialogue / Large language dialogue models and data)
README
# Chatbot Ollama
## About
Chatbot Ollama is an open source chat UI for Ollama.
This project is based on [chatbot-ui](https://github.com/mckaywrigley/chatbot-ui) by [Mckay Wrigley](https://github.com/mckaywrigley).
![Chatbot Ollama](./public/screenshots/screenshot-2023-10-02.png)
## Updates
Chatbot Ollama will be updated over time.
### Next up
- [ ] pull a model
- [ ] delete a model
- [ ] show model information
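Until these land in the UI, the equivalent operations can be done directly with the Ollama CLI. A minimal sketch, assuming a standard Ollama installation and using `mistral:latest` purely as an example model name:

```bash
# Download (pull) a model from the Ollama library
ollama pull mistral:latest

# Print information about a locally available model
ollama show mistral:latest

# Remove a local model
ollama rm mistral:latest
```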
## Docker

Build locally:
```shell
docker build -t chatbot-ollama .
docker run -p 3000:3000 chatbot-ollama
```

Pull from ghcr:
```bash
docker run -p 3000:3000 ghcr.io/ivanfioravanti/chatbot-ollama:main
```

## Running Locally
### 1. Clone Repo
```bash
git clone https://github.com/ivanfioravanti/chatbot-ollama.git
```

### 2. Move to folder
```bash
cd chatbot-ollama
```

### 3. Install Dependencies
```bash
npm ci
```

### 4. Run Ollama server

Either via the CLI:
```bash
ollama serve
```

or via the [desktop client](https://ollama.ai/download).
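Before starting the app, it's worth checking that the Ollama server is actually reachable. A quick sanity check, assuming Ollama is listening on its default address `http://localhost:11434`:

```bash
# Should return a JSON list of the models you have pulled locally
curl http://localhost:11434/api/tags
```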
### 5. Run App
```bash
npm run dev
```

### 6. Use It
Open the app in your browser (the dev server listens on http://localhost:3000 by default) and you should be able to start chatting.
## Configuration
When deploying the application, the following environment variables can be set:
| Environment Variable | Default value | Description |
| --------------------------------- | ------------------------------ | ----------------------------------------------------------------------------------------------------------------------------------------- |
| DEFAULT_MODEL | `mistral:latest` | The default model to use on new conversations |
| NEXT_PUBLIC_DEFAULT_SYSTEM_PROMPT | [see here](utils/app/const.ts) | The default system prompt to use on new conversations |
| NEXT_PUBLIC_DEFAULT_TEMPERATURE   | 1                              | The default temperature to use on new conversations                                                                                        |
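As an illustration, these can be exported before starting the app locally; the model name, prompt, and temperature below are example values, not project defaults:

```bash
# Example values only — use any model you have already pulled with Ollama
export DEFAULT_MODEL="llama2:latest"
export NEXT_PUBLIC_DEFAULT_SYSTEM_PROMPT="You are a concise assistant."
export NEXT_PUBLIC_DEFAULT_TEMPERATURE=0.7
npm run dev
```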
## Contact

If you have any questions, feel free to reach out to me on [X](https://x.com/ivanfioravanti).