https://github.com/eternalflame02/ollama-chat-interface
Ollama-chat-interface is a simple, lightweight interface built with Flask and Bootstrap 5 for interacting with offline language models via Ollama. It features code snippet detection, a collapsible "thinking" section using <think> tags, and dark mode support, making it easy to view and copy code outputs while maintaining a modern, responsive design.
- Host: GitHub
- URL: https://github.com/eternalflame02/ollama-chat-interface
- Owner: eternalflame02
- License: MIT
- Created: 2025-02-26T08:41:19.000Z (3 months ago)
- Default Branch: main
- Last Pushed: 2025-02-26T15:55:30.000Z (3 months ago)
- Last Synced: 2025-02-26T16:24:12.527Z (3 months ago)
- Topics: deepseek-r1-distill-qwen, ollama, ollama-interface, python-flask
- Language: HTML
- Homepage:
- Size: 10.7 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# Ollama-chat-interface
## Chatbot UI with Code Snippet Support, Dark Mode, and Ollama Integration
A modern, dynamic chatbot UI built with Flask and Bootstrap 5. This interface supports code snippet detection and separation, dark mode, and a clean, responsive design—all integrated with [Ollama](https://ollama.com) to run offline language models.
> **Note:** Before using this interface, ensure you have Ollama installed and running on your machine (typically listening on `http://127.0.0.1:11434`). For more information, please refer to the [Ollama documentation](https://ollama.com).
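
To make the integration concrete, here is a minimal sketch of the kind of request a backend can send to a local Ollama server. It uses Ollama's standard `/api/generate` endpoint and the `deepseek-r1:7b` model named later in this README; the exact call in `app.py` may differ.

```python
# Minimal sketch of a call to a local Ollama server (not the repo's exact code).
import requests

OLLAMA_URL = "http://127.0.0.1:11434/api/generate"

def ask_ollama(prompt: str, model: str = "deepseek-r1:7b") -> str:
    """Send a prompt to Ollama and return the complete (non-streamed) reply."""
    payload = {"model": model, "prompt": prompt, "stream": False}
    resp = requests.post(OLLAMA_URL, json=payload, timeout=300)
    resp.raise_for_status()
    return resp.json()["response"]

if __name__ == "__main__":
    print(ask_ollama("Say hello in one sentence."))
```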
---
## Features
- **Modern UI:** Clean and responsive chat interface using Bootstrap 5.
- **Dark Mode:** Toggle dark mode for a better visual experience.
- **Code Snippet Parsing:** Automatically detects code blocks (formatted with triple backticks) and displays them in separate chat bubbles with copy buttons.
- **Collapsible "Thinking" Section:** Use `<think>` and `</think>` tags in the bot's response to display internal reasoning separately in a collapsible block (a parsing sketch follows this list).
- **Loading Indicator:** A spinner is shown while waiting for the chatbot's response.
- **Ollama Integration:** Communicates with a locally running Ollama server to generate responses.
- **Extensible:** A solid foundation to add more features like syntax highlighting, persistent chat history, and user authentication.
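
As a rough illustration of the code-snippet and "thinking" features above, the sketch below splits a raw model reply into its `<think>` content, visible prose, and fenced code blocks using Python's `re` module. Whether the repo does this server- or client-side is not specified here; treat this only as a model of the splitting logic, not the repo's code.

```python
# Illustrative splitting of a model reply into thinking, prose, and code.
import re

THINK_RE = re.compile(r"<think>(.*?)</think>", re.DOTALL)
CODE_RE = re.compile(r"```(\w*)\n(.*?)```", re.DOTALL)

def parse_reply(raw: str) -> dict:
    # Collect and strip the collapsible "thinking" content.
    thinking = "\n".join(m.strip() for m in THINK_RE.findall(raw))
    visible = THINK_RE.sub("", raw)
    # Pull out fenced code blocks so they can be shown in copyable bubbles.
    code_blocks = [
        {"lang": lang or "text", "code": body.strip()}
        for lang, body in CODE_RE.findall(visible)
    ]
    prose = CODE_RE.sub("", visible).strip()
    return {"thinking": thinking, "prose": prose, "code": code_blocks}
```

---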
## Future Enhancements
- **Syntax Highlighting:** Integrate libraries like PrismJS or Highlight.js for better code rendering.
- **Persistent Chat History:** Save chat logs using local storage or a backend database.
- **User Authentication:** Implement user accounts for personal chat histories.
- **Real-time Updates:** Use WebSockets (e.g., Flask-SocketIO) for a more dynamic experience (a minimal sketch follows this list).
- **Containerization:** Dockerize the application for easier deployment.
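
For the real-time idea above, a minimal Flask-SocketIO sketch might look like the following. Nothing here exists in the repository yet; the event names are invented for illustration.

```python
# Hypothetical real-time variant using Flask-SocketIO (not part of the repo).
from flask import Flask
from flask_socketio import SocketIO, emit

app = Flask(__name__)
socketio = SocketIO(app)

@socketio.on("user_message")
def handle_user_message(data):
    # A real integration would forward the text to Ollama and stream
    # tokens back; this stub just echoes to show the event flow.
    emit("bot_message", {"text": "echo: " + data.get("text", "")})

if __name__ == "__main__":
    socketio.run(app, port=5000)
```

---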
## Installation
1. **Clone the Repository:**
```bash
git clone https://github.com/eternalflame02/Ollama-chat-interface.git
cd Ollama-chat-interface
```
2. **Create and Activate a Virtual Environment:**
```bash
python -m venv venv
# On Windows:
venv\Scripts\activate
# On macOS/Linux:
source venv/bin/activate
```
3. **Install Dependencies:**
```bash
pip install -r requirements.txt
```
or
```bash
pip install Flask==3.1.0 requests==2.32.3
```
4. **Ensure Ollama is Installed and Running:**
- Download and install Ollama from [https://ollama.com](https://ollama.com).
- Run the Ollama server:
```bash
ollama serve
```
- Ensure the desired model is available:
```bash
ollama list
```
   The project now uses the `deepseek-r1:7b` model. If it is not listed, download it with `ollama pull deepseek-r1:7b`.

5. **Run the Flask Application:**
```bash
python app.py
```
   The application will be available at [http://127.0.0.1:5000](http://127.0.0.1:5000).

---
## Project Structure
```
Ollama-chat-interface/
├── app.py              # Main Flask application
├── templates/
│   └── index.html      # HTML file containing the chatbot UI
├── requirements.txt    # Python dependencies
└── README.md           # Project README (this file)
```
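
To show how these files fit together, here is a hedged sketch of what `app.py` might contain. The `/chat` route name and the `message`/`reply` fields are assumptions for illustration; check the actual file for the real endpoint and payload.

```python
# Hedged sketch of app.py's overall shape; route and field names are guesses.
from flask import Flask, render_template, request, jsonify
import requests

app = Flask(__name__)
OLLAMA_URL = "http://127.0.0.1:11434/api/generate"

@app.route("/")
def index():
    # Serves templates/index.html, the chatbot UI.
    return render_template("index.html")

@app.route("/chat", methods=["POST"])
def chat():
    prompt = request.json.get("message", "")
    resp = requests.post(
        OLLAMA_URL,
        json={"model": "deepseek-r1:7b", "prompt": prompt, "stream": False},
        timeout=300,
    )
    resp.raise_for_status()
    return jsonify({"reply": resp.json()["response"]})

if __name__ == "__main__":
    app.run(debug=True)  # http://127.0.0.1:5000
```

---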
## Usage
- Open your browser and navigate to [http://127.0.0.1:5000](http://127.0.0.1:5000).
- Type a message in the chat input and press "Send."
- The chatbot supports:
- **Code Snippets:** Format your code with triple backticks (```) to display it in separate, copyable chat bubbles.
  - **Thinking Section:** Wrap any internal reasoning or additional details between `<think>` and `</think>` tags. The content between these tags will appear in a collapsible block below the main answer.
- Ensure that the Ollama server is running locally to generate responses from the chosen offline model.
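
For scripted testing without the browser, a request can be sent directly to the Flask backend. As with the `app.py` sketch above, the `/chat` route and the `message` field are guesses, not confirmed against the actual code.

```python
# Hypothetical scripted test against the running app; endpoint name is a guess.
import requests

reply = requests.post(
    "http://127.0.0.1:5000/chat",
    json={"message": "Explain Python list comprehensions briefly."},
    timeout=300,
)
print(reply.json())
```

---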
## Contributing
Contributions are welcome! Please fork the repository and submit pull requests with your enhancements or bug fixes.
---
## License
This project is licensed under the MIT License.