Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/ldcmleo/webui-boilerplate
A straightforward Docker project designed to simplify the setup of a Web User Interface (WebUI) alongside an Ollama Service. Using Docker Compose, this project streamlines the process of launching both services, ensuring seamless integration and providing a flexible, scalable environment suitable for various development and testing scenarios.
- Host: GitHub
- URL: https://github.com/ldcmleo/webui-boilerplate
- Owner: ldcmleo
- License: MIT
- Created: 2024-11-04T04:56:34.000Z (3 months ago)
- Default Branch: main
- Last Pushed: 2024-11-06T18:43:31.000Z (3 months ago)
- Last Synced: 2024-11-22T01:08:52.517Z (3 months ago)
- Topics: docker
- Homepage:
- Size: 2.93 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# WebUI-Boilerplate
A simple Docker project with Docker Compose to set up a WebUI and Ollama Service environment.
## Requirements
- Docker Engine or Docker Desktop installed
- Docker Compose plugin

## Usage
Clone this repository into a folder of your choice:
```bash
git clone git@github.com:ldcmleo/WebUI-Boilerplate.git
```

This will create a folder called WebUI-Boilerplate. Navigate into it and start the services:
```bash
cd WebUI-Boilerplate
docker compose up -d
```

## Creating a Model for Ollama
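For reference, a minimal `docker-compose.yml` for this kind of setup might look like the following. This is a hypothetical sketch, not the repository's actual file; the service names, the `open-webui` image, and the port mappings are assumptions.

```yaml
services:
  ollama:
    image: ollama/ollama            # Ollama model server
    container_name: ollama
    volumes:
      - ollama:/root/.ollama        # persist downloaded models across restarts
    ports:
      - "11434:11434"               # Ollama's REST API

  webui:
    image: ghcr.io/open-webui/open-webui:main   # assumed WebUI image
    container_name: webui
    ports:
      - "3000:8080"                 # browse the UI at http://localhost:3000
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434     # point the UI at the Ollama container
    depends_on:
      - ollama

volumes:
  ollama:
```

With a file like this in place, `docker compose up -d` starts both containers on a shared network, so the WebUI can reach Ollama by its service name.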
This project uses Ollama to serve language models, providing a ChatGPT-like environment. To add a new model to your Ollama service, follow these steps:

### Using the Ollama container
To access the Ollama container, run the following command:

```bash
docker exec -it ollama bash
```

Once inside the container, pull and run a model with an Ollama command, for example:
```bash
ollama run llama3.2
```

You can browse the available models in the [Ollama library](https://ollama.com/library).
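Once a model has been pulled, you can also talk to Ollama directly over its REST API from the host, assuming the default port 11434 is published as in a typical compose setup:

```bash
# Ask the running llama3.2 model a question via Ollama's /api/generate endpoint.
# "stream": false returns a single JSON response instead of a token stream.
curl http://localhost:11434/api/generate -d '{
  "model": "llama3.2",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
```

This is handy for scripting against the service without going through the WebUI.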