Ecosyste.ms: Awesome

An open API service indexing awesome lists of open source software.

https://github.com/ashmadev/react-ollama-ui

Awesome UI for interacting with your local LLMs

Topics: ai, chatbot, llm-inference, ollama

Last synced: 2 months ago

README

# React Ollama UI

React Ollama UI is a web interface for [ollama.ai](https://ollama.ai/download), a tool that enables running Large Language Models (LLMs) on your local machine.


*Screenshot: React Ollama UI preview*

Check out [live preview](https://react-ollama-ui.vercel.app/)!

## ⚙️ Installation

### Prerequisites

1. Download and install the [Ollama CLI](https://ollama.ai/download).

2. Run a model of your choice from the [Ollama library](https://ollama.com/library).

```bash
# Replace "llama3" with any model from the Ollama library
ollama run llama3
```

3. Download and install [pnpm](https://pnpm.io/installation) and [Node.js](https://nodejs.org/en/download). A quick way to verify all three prerequisites is shown below.
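
As a quick sanity check, confirm that all three tools are available on your PATH (version numbers will vary):

```bash
# Verify that the prerequisites are installed
ollama --version
node --version
pnpm --version
```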

### Getting Started

1. Clone the repository, install dependencies, and start the dev server.

```bash
git clone https://github.com/AshmaDev/react-ollama-ui.git
cd react-ollama-ui
pnpm install
pnpm run dev
```
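
By default, Vite serves the app at `http://localhost:5173`. If you also want a production build, the standard Vite scripts should work (a sketch, assuming the default `build` and `preview` scripts are present in `package.json`):

```bash
# Build an optimized production bundle and preview it locally
# (assumes the default Vite "build" and "preview" scripts in package.json)
pnpm run build
pnpm run preview
```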

## 🐳 Quick Start with Docker

> [!NOTE]
> The current Docker Compose configuration runs Ollama on CPU only. If you wish to use an NVIDIA or AMD GPU, you will need to modify the `docker-compose.yml` file. For more details, visit the [Ollama Docker Hub page](https://hub.docker.com/r/ollama/ollama).

```bash
docker compose up -d
```
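
Once the containers are up, a model still needs to be pulled inside the Ollama container before the UI can chat with it. A minimal sketch, assuming the Compose service is named `ollama` (check `docker-compose.yml` for the actual service name):

```bash
# Pull a model inside the running Ollama container
# (the service name "ollama" is an assumption; check docker-compose.yml)
docker compose exec ollama ollama pull llama3
```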

---

## 🛠 Built With

- [Ollama.ai](https://ollama.ai/)
- [React.js](https://react.dev/)
- [Vite](https://vitejs.dev/)
- [Tailwind CSS](https://tailwindcss.com/)
- [@phosphor-icons/react](https://phosphoricons.com)

---

## 📝 License

Licensed under the MIT License. See the [LICENSE](LICENSE.md) file for details.