https://github.com/ousseeeef/chat-o-llama
Chat-O-Llama is a user-friendly web interface for managing conversations with Ollama, featuring persistent chat history. Easily set up and start your chat sessions with just a few commands. 🐙💻
- Host: GitHub
- URL: https://github.com/ousseeeef/chat-o-llama
- Owner: ousseeeef
- License: mit
- Created: 2025-06-01T15:18:46.000Z (4 months ago)
- Default Branch: main
- Last Pushed: 2025-06-30T17:07:48.000Z (3 months ago)
- Last Synced: 2025-06-30T18:24:32.034Z (3 months ago)
- Topics: ai-chat, chat, conversation-history, cpu-only, docker, fasttext-embeddings, gradio, langchain-python, lightweight, llama3, llama3-meta-ai, local-ai, mlflow, mlflow-tracking, offline-ai, python, qdrant-client, qdrant-vector-database
- Language: HTML
- Size: 105 KB
- Stars: 0
- Watchers: 0
- Forks: 0
- Open Issues: 0
- Metadata Files:
  - Readme: README.md
  - License: LICENSE
# 🦙 Chat-O-Llama: Your Lightweight AI Chat Solution
Welcome to the **Chat-O-Llama** repository! This project provides a lightweight web interface for Ollama, featuring persistent chat history and a focus on privacy. With Chat-O-Llama, you can self-host your AI chat solution without relying on any external services, making it ideal for local AI development.
[Download the latest release](https://github.com/ousseeeef/chat-o-llama/releases)
## Table of Contents
1. [Features](#features)
2. [Installation](#installation)
3. [Usage](#usage)
4. [Contributing](#contributing)
5. [License](#license)
6. [Contact](#contact)

## Features
Chat-O-Llama offers a range of features that enhance your AI chat experience:
- **Lightweight Design**: The interface is simple and easy to navigate.
- **Persistent Chat History**: Your conversations are saved, allowing you to revisit them at any time.
- **Conversation Management**: Easily manage your chat sessions.
- **Search Functionality**: Quickly find previous conversations.
- **Self-Hosted**: Run the application on your local machine without relying on external services.
- **Privacy-Focused**: Your data remains on your machine, ensuring your privacy.
- **Zero External Dependencies**: No cloud APIs or third-party services are required; everything runs locally alongside your Ollama instance.
- **CPU-Only Support**: Designed to run efficiently on CPU-only setups.

## Installation
To get started with Chat-O-Llama, follow these steps:
1. **Clone the Repository**:
```bash
git clone https://github.com/ousseeeef/chat-o-llama.git
```

2. **Navigate to the Project Directory**:
```bash
cd chat-o-llama
```

3. **Install Dependencies**:
Use pip to install the required packages:
```bash
pip install -r requirements.txt
```

4. **Run the Application**:
Start the server with:
```bash
python app.py
```

5. **Access the Web Interface**:
Open your web browser and go to `http://127.0.0.1:5000`.

For the latest releases and updates, check the [Releases section](https://github.com/ousseeeef/chat-o-llama/releases).
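Before launching the app, make sure a local Ollama instance is running and has at least one model pulled, since the web interface relies on it for responses. The snippet below is a minimal sketch of that prerequisite check; it assumes Ollama's default port `11434` and the `llama3` model listed in this repository's topics, and it also shows an optional virtual environment for the dependency install above:

```bash
# Optional: isolate the Python dependencies in a virtual environment
python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt

# Pull a model for Ollama to serve (llama3 is only an example; any local model works)
ollama pull llama3

# Verify the Ollama API responds on its default port before starting app.py
curl http://localhost:11434/api/tags
```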
## Usage
Using Chat-O-Llama is straightforward:
- **Start a Conversation**: Simply type your message in the input box and hit enter.
- **View Chat History**: Click on the history button to see your past conversations.
- **Search Conversations**: Use the search bar to find specific messages or topics.

### Example Interaction
1. User: "Hello, Chat-O-Llama!"
2. Chat-O-Llama: "Hello! How can I assist you today?"

### Tips for Effective Use
- Regularly check your chat history to keep track of important discussions.
- Use the search feature to quickly locate specific topics or questions.
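The web interface passes your messages to the local Ollama server behind the scenes. This README does not document Chat-O-Llama's internals, so treat the following only as an illustrative sketch: it calls Ollama's standard `/api/chat` endpoint directly (again assuming the default port `11434` and a pulled `llama3` model) to confirm that the model itself responds, independently of the UI:

```bash
# Send a chat request straight to the local Ollama server, bypassing the web UI;
# the model name and port are assumptions, adjust them to your setup.
curl http://localhost:11434/api/chat -d '{
  "model": "llama3",
  "messages": [
    { "role": "user", "content": "Hello, Chat-O-Llama!" }
  ],
  "stream": false
}'
```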
## Contributing

We welcome contributions to Chat-O-Llama! If you'd like to contribute, please follow these steps:
1. **Fork the Repository**: Click the "Fork" button at the top right of the page.
2. **Create a New Branch**:
```bash
git checkout -b feature/YourFeatureName
```
3. **Make Your Changes**: Edit the code as needed.
4. **Commit Your Changes**:
```bash
git commit -m "Add your message here"
```
5. **Push to Your Branch**:
```bash
git push origin feature/YourFeatureName
```
6. **Create a Pull Request**: Go to the original repository and click on "New Pull Request".
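Not required by the maintainers, but a common extra step before opening the pull request is to bring your branch up to date with the upstream repository so the diff stays small. A minimal sketch of that workflow, assuming the default branch `main` noted in the metadata above:

```bash
# One-time: register the original repository as "upstream"
git remote add upstream https://github.com/ousseeeef/chat-o-llama.git

# Rebase your feature branch on the latest upstream main before opening the PR
git fetch upstream
git rebase upstream/main
git push --force-with-lease origin feature/YourFeatureName
```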
## License

Chat-O-Llama is licensed under the MIT License. Feel free to use, modify, and distribute this project as you see fit.
## Contact
For any questions or suggestions, feel free to reach out:
- GitHub: [ousseeeef](https://github.com/ousseeeef)
- Email: your.email@example.com

Thank you for checking out Chat-O-Llama! We hope you enjoy using this lightweight, privacy-focused AI chat solution. Don't forget to visit the [Releases section](https://github.com/ousseeeef/chat-o-llama/releases) for updates and new features!