Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/eduardolagoeiro/dumbledore
Last synced: 19 days ago
- Host: GitHub
- URL: https://github.com/eduardolagoeiro/dumbledore
- Owner: eduardolagoeiro
- Created: 2024-09-14T15:34:05.000Z (5 months ago)
- Default Branch: main
- Last Pushed: 2024-09-15T03:57:35.000Z (5 months ago)
- Last Synced: 2024-11-24T10:26:09.160Z (3 months ago)
- Language: Python
- Size: 13.7 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
Awesome Lists containing this project
README
# Python CLI with Ollama Integration (GPU-enabled)
This project is a Python CLI that interacts with a local Ollama instance running LLaMA, using a Retrieval-Augmented Generation (RAG) setup. A local database stores relevant documents that are used to augment the LLaMA model’s responses.
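As an illustration of the RAG flow described above, here is a minimal sketch that retrieves context documents and sends an augmented prompt to Ollama's `/api/generate` endpoint. The document store, the `retrieve_documents` helper, and the model name (`llama3`) are assumptions for the example, not the project's actual implementation.

```python
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"  # default local Ollama endpoint


def retrieve_documents(question: str) -> list[str]:
    # Placeholder retrieval step: the real project stores documents in a
    # database; here we return a hard-coded snippet purely for illustration.
    return ["Dumbledore is the headmaster of Hogwarts."]


def ask_with_rag(question: str, model: str = "llama3") -> str:
    # Prepend the retrieved context to the user question before calling the model.
    context = "\n".join(retrieve_documents(question))
    prompt = (
        "Use the following context to answer.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}"
    )
    response = requests.post(
        OLLAMA_URL,
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    response.raise_for_status()
    return response.json()["response"]


if __name__ == "__main__":
    print(ask_with_rag("Who runs Hogwarts?"))
```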
## Requirements
- Python 3.10
- Docker
- Docker Compose
- Ollama

## Setup Instructions
### Step 1: Clone the Repository
```bash
git clone git@github.com:eduardolagoeiro/dumbledore.git
cd dumbledore
```

### Step 2: Set up a Python Virtual Environment
Create and activate the virtual environment using Python 3.10:
```bash
python -m venv venv
source venv/bin/activate # On macOS/Linux
# or
venv\Scripts\activate # On Windows
```

### Step 3: Install Python Dependencies
```bash
pip install -r requirements.txt
```

### Step 4: Run the CLI
Once the environment is set up and the Ollama instance is running with LLaMA installed, you can start the CLI by running:
```bash
python cli.py
```

This will open the main menu of the CLI, where you can interact with GitHub repositories or start chatting with the AI model.
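The actual menu lives in `cli.py`; as a rough, hypothetical sketch of what such a loop might look like (the option labels and handlers below are illustrative, not the project's real code):

```python
def main() -> None:
    # Hypothetical main-menu loop; the real cli.py may be structured differently.
    options = {
        "1": ("Browse GitHub repositories", lambda: print("...listing repositories...")),
        "2": ("Chat with the AI model", lambda: print("...starting chat session...")),
        "3": ("Quit", None),
    }
    while True:
        print("\n=== Dumbledore CLI ===")
        for key, (label, _) in options.items():
            print(f"{key}. {label}")
        choice = input("Select an option: ").strip()
        if choice == "3":
            break
        entry = options.get(choice)
        if entry:
            entry[1]()
        else:
            print("Unknown option, try again.")


if __name__ == "__main__":
    main()
```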