https://github.com/drcbeatz/ollama-python-chat
Simple chat web app built with Python, Flask, Ollama, and Docker. Uses the open-source LLM llama3.1:8b.
- Host: GitHub
- URL: https://github.com/drcbeatz/ollama-python-chat
- Owner: DrCBeatz
- Created: 2024-07-30T02:39:19.000Z (about 1 year ago)
- Default Branch: main
- Last Pushed: 2024-08-02T14:09:20.000Z (about 1 year ago)
- Last Synced: 2025-01-11T08:30:31.408Z (9 months ago)
- Topics: chat-application, docker, flask, llama3-1, llm, ollama, python
- Language: JavaScript
- Homepage:
- Size: 13.7 KB
- Stars: 0
- Watchers: 2
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
# Ollama Python Chat
A simple chat web app built with Python, Flask, and Ollama, running in Docker. It uses the llama3.1:8b model by default, but any model from the Ollama library (for example, dolphin-mistral:7b) can be configured via the `.env` file. The project is suitable for running locally on a desktop or laptop, even without a GPU.
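The application code itself is not reproduced in this summary, so the following is only a minimal sketch of how a Flask backend commonly forwards chat messages to Ollama's REST API. Ollama's `/api/chat` endpoint and the `MODEL_NAME` variable come from this project's setup; the `/chat` route name, the request/response shape, and the `OLLAMA_URL` default are assumptions.

```python
# app.py - minimal sketch of a Flask chat proxy for Ollama (not the repository's actual code)
import os

import requests
from flask import Flask, jsonify, request

app = Flask(__name__)

# MODEL_NAME comes from the .env file described in step 2 below.
MODEL_NAME = os.getenv("MODEL_NAME", "llama3.1:8b")
# "ollama" is the Compose service name used in step 4; 11434 is Ollama's default API port.
OLLAMA_URL = os.getenv("OLLAMA_URL", "http://ollama:11434")


@app.route("/chat", methods=["POST"])  # route name is hypothetical
def chat():
    user_message = request.json.get("message", "")
    # Ollama's chat endpoint: POST /api/chat with a list of messages.
    resp = requests.post(
        f"{OLLAMA_URL}/api/chat",
        json={
            "model": MODEL_NAME,
            "messages": [{"role": "user", "content": user_message}],
            "stream": False,
        },
        timeout=120,
    )
    resp.raise_for_status()
    return jsonify({"reply": resp.json()["message"]["content"]})


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```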
## Prerequisites
- Docker (with the Docker Compose plugin, used by the `docker compose` commands below)
## Getting Started
### 1. Clone the Repository
Clone this repository to your local machine using the following command:
`git clone https://github.com/drcbeatz/ollama-python-chat.git`
Then navigate to the project directory:
`cd ollama-python-chat`
### 2. Create a .env File
Create a .env file in the root of the project directory and add the following line to specify the LLM model name:
`MODEL_NAME=llama3.1:8b`
You can change `llama3.1:8b` to any other model available in the Ollama model library; just be sure to pull the same model name in step 4.
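For example, to use the dolphin-mistral model instead, the `.env` file would contain:
`MODEL_NAME=dolphin-mistral:7b`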
### 3. Build and Run the Docker Containers
Build and run the Docker containers using Docker Compose:
`docker compose up -d --build`
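The project's `docker-compose.yml` is not included in this summary; as a rough sketch, a two-service setup of this kind is commonly wired like the example below. The `ollama` service name matches step 4 and port 5000 matches step 5, but the `web` service name, the build context, and the named volume are assumptions.

```yaml
services:
  ollama:
    image: ollama/ollama            # official Ollama image
    volumes:
      - ollama_data:/root/.ollama   # persist pulled models between runs
    ports:
      - "11434:11434"               # Ollama's default API port

  web:
    build: .                        # the Flask app (layout assumed)
    env_file: .env                  # passes MODEL_NAME to the app
    ports:
      - "5000:5000"                 # matches the URL in step 5
    depends_on:
      - ollama

volumes:
  ollama_data:
```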
### 4. Pull the Model
Pull the llama3.1:8b model using the following command:
`docker compose exec ollama ollama pull llama3.1:8b`
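To confirm the download succeeded, you can list the models available inside the container, or run a quick one-off prompt (both are standard Ollama CLI commands):
`docker compose exec ollama ollama list`
`docker compose exec ollama ollama run llama3.1:8b "Say hello"`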
### 5. Access the Web App
Open your web browser and go to the following URL:
`http://localhost:5000/`
### 6. Stop the Containers
When you're done, stop the containers using:
`docker compose down`
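If you also want to remove the downloaded model data (assuming it is stored in a named Docker volume, as in the sketch above), add the `-v` flag: `docker compose down -v`.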
## License
This project is licensed under the MIT License.