Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/ahmetkca/polyollama
Run multiple open source large language models concurrently powered by Ollama
chatbot large-language-models ollama ollama-interface
- Host: GitHub
- URL: https://github.com/ahmetkca/polyollama
- Owner: ahmetkca
- Created: 2024-02-08T04:03:35.000Z (11 months ago)
- Default Branch: main
- Last Pushed: 2024-04-14T13:33:43.000Z (9 months ago)
- Last Synced: 2024-05-30T02:19:19.544Z (8 months ago)
- Topics: chatbot, large-language-models, ollama, ollama-interface
- Language: TypeScript
- Homepage:
- Size: 443 KB
- Stars: 20
- Watchers: 2
- Forks: 1
- Open Issues: 0
Metadata Files:
- Readme: README.md
Awesome Lists containing this project
README
# PolyOllama
Run multiple open source large language models (the same model or different ones), such as [Llama2](https://ollama.com/library/llama2), [Mistral](https://ollama.com/library/mistral), and [Gemma](https://ollama.com/library/gemma), in parallel, powered by [Ollama](https://ollama.com/).
## Demo
https://github.com/ahmetkca/PolyOllama/assets/74574469/f0084d3c-6223-4f7e-9442-2aa5f79af10d
## Instructions to run it locally
> You need [Ollama](https://ollama.ai) installed on your computer.
Keyboard shortcuts: `cmd + k` opens the chat prompt (`alt + k` on Windows).

Start the backend:

```bash
cd backend
bun install
bun run index.ts
```

Start the frontend:

```bash
cd frontend
bun install
bun run dev
```

> Running in Docker containers: frontend + (backend + ollama)
On Windows:
```bash
docker compose -f docker-compose.windows.yml up
```

On Linux/macOS:
```bash
docker compose -f docker-compose.unix.yml up
```

The frontend is available at http://localhost:5173.
> :warning: **Still a work in progress**
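The core idea behind PolyOllama (sending one prompt to several Ollama models at once) can be sketched as below. This is not the project's actual implementation, just a minimal illustration of the fan-out pattern, assuming Ollama's default endpoint `http://localhost:11434` and its `/api/chat` route; the model names are only examples.

```typescript
// Minimal sketch: fan a single prompt out to several Ollama models
// concurrently. Assumes Ollama is serving on its default port 11434.

type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

type ChatRequest = {
  model: string;          // e.g. "llama2", "mistral", "gemma"
  messages: ChatMessage[];
  stream: boolean;        // false => one complete JSON response per request
};

// Build one /api/chat request body per model, all for the same prompt.
function buildRequests(prompt: string, models: string[]): ChatRequest[] {
  return models.map((model) => ({
    model,
    messages: [{ role: "user", content: prompt }],
    stream: false,
  }));
}

// Send all requests in parallel and collect "<model>: <answer>" strings.
async function chatAll(prompt: string, models: string[]): Promise<string[]> {
  return Promise.all(
    buildRequests(prompt, models).map(async (req) => {
      const res = await fetch("http://localhost:11434/api/chat", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify(req),
      });
      const data = await res.json();
      return `${req.model}: ${data.message.content}`;
    }),
  );
}
```

With Ollama running and the models pulled, `await chatAll("Why is the sky blue?", ["llama2", "mistral"])` would return one answer per model.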