Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
Decentralized RAG with Blockchain
https://github.com/bayesianinstitute/decentralized-rag
- Host: GitHub
- URL: https://github.com/bayesianinstitute/decentralized-rag
- Owner: bayesianinstitute
- Created: 2024-07-26T21:04:03.000Z (5 months ago)
- Default Branch: main
- Last Pushed: 2024-12-12T23:02:44.000Z (11 days ago)
- Last Synced: 2024-12-21T10:25:23.202Z (2 days ago)
- Topics: blockchain, ipfs, llm, rag
- Language: Jupyter Notebook
- Homepage:
- Size: 8 MB
- Stars: 0
- Watchers: 0
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
## Local Decentralized RAG File
This repository houses a demo application for local and decentralized Retrieval Augmented Generation (RAG). The project allows users to interact with local knowledge bases and contribute to a shared, global knowledge base.
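The local retrieval step at the heart of RAG can be illustrated with a minimal, dependency-free sketch: rank stored documents by cosine similarity of their embedding vectors against a query vector. The vectors and documents below are invented for illustration and stand in for real embeddings (e.g. from an embedding model):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def retrieve(query_vec, store, k=1):
    """Return the texts of the k documents whose embeddings are closest to the query."""
    ranked = sorted(store, key=lambda d: cosine(query_vec, d["vec"]), reverse=True)
    return [d["text"] for d in ranked[:k]]

# Toy knowledge base: embedding/text pairs (real embeddings would come from a model).
store = [
    {"vec": [1.0, 0.0, 0.1], "text": "IPFS pins content by hash."},
    {"vec": [0.0, 1.0, 0.2], "text": "Qdrant stores dense vectors."},
]

print(retrieve([0.1, 0.9, 0.3], store))  # closest to the Qdrant entry
```

In the actual application, the embeddings come from an embedding model and the store is a vector database; the ranking idea is the same.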
### Run in Docker Container
Run all services:
```bash
docker compose up -d
```

Download the model and run the application:
```bash
bash run.sh
```

### Clone and Build
1. Clone the repository:
```bash
git clone https://github.com/bayesianinstitute/Decentralized-RAG
cd Decentralized-RAG
```

2. Build the package:
```bash
python setup.py sdist bdist_wheel
pip install .
```

### Setting up Qdrant (Vector Database)
1. **Download Qdrant Image:**
```bash
docker pull qdrant/qdrant
```

2. **Run Qdrant:**
```bash
docker run -d -p 6333:6333 -p 6334:6334 \
-v ./qdrant_data:/qdrant/storage \
qdrant/qdrant
```
This command starts a Qdrant instance and maps the necessary ports. It also mounts a local directory (`qdrant_data`) to persist the database.

On Windows (PowerShell), replace the volume path with a directory on your machine:
```bash
docker run -d --name qdrant_container -p 6333:6333 -p 6334:6334 -v ${PWD}/qdrant_data:/qdrant/storage qdrant/qdrant:latest
```
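Once the container is up, you can check that the REST port is answering before starting the application. A minimal stdlib sketch (the `/healthz` endpoint and default base URL are assumptions about a standard local Qdrant deployment):

```python
import urllib.request

def qdrant_up(base_url="http://localhost:6333", timeout=2.0):
    """Return True if a Qdrant instance answers on its REST port."""
    try:
        # Qdrant exposes a plain-text health endpoint; 200 means it is serving.
        with urllib.request.urlopen(base_url + "/healthz", timeout=timeout) as resp:
            return resp.status == 200
    except OSError:
        # Connection refused, DNS failure, or timeout: treat as "not up".
        return False

if __name__ == "__main__":
    print("Qdrant reachable:", qdrant_up())
```

Running this against a live container should print `True`; against a stopped one it returns `False` instead of raising.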
### Dependencies

1. **Ollama:**
- Download and install Ollama by following the instructions on the official website: [https://ollama.ai/](https://ollama.ai/)
2. **Language Model:**
- Choose a language model from the Ollama library ([https://ollama.ai/library](https://ollama.ai/library)) or create your own. Make sure to pull the model using the `ollama pull` command. For example, to pull the "llama3:8b" model:
```bash
ollama pull llama3:8b
```

3. **Text Embedding Model:**
```bash
ollama pull nomic-embed-text:latest
```

4. **Other Python Libraries:**
- Install any other required Python libraries, likely including:
* `qdrant-client` (to interact with Qdrant)

### Running the Application
1. **Configure Node Type:**
- Modify the `main.py` file to specify the desired node type:
* `admin`: Institute Node (manages the global embedding)
* `data`: Data Node (contributes specialized knowledge)

2. **Start the Application:**
```bash
python main.py --data-dir data --nodetype admin
```
- Replace `data` with your desired data directory.
- Set `--nodetype` to either `admin` or `data`.

## References
Ollama Docker hub: [https://hub.docker.com/r/ollama/ollama](https://hub.docker.com/r/ollama/ollama)
### IPFS
System requirements: https://docs.ipfs.tech/install/command-line/#system-requirements

Install Kubo on Windows (PowerShell):
```bash
wget https://dist.ipfs.tech/kubo/v0.23.0/kubo_v0.23.0_windows-amd64.zip -Outfile kubo_v0.23.0.zip
Expand-Archive -Path kubo_v0.23.0.zip -DestinationPath .\kubo
cd .\kubo
.\install.bat
```
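IPFS addresses files by the hash of their content rather than by their location. Real CIDs use multihash/CIDv1 encoding, but the core idea can be sketched with a plain SHA-256 digest (an illustration only, not a real CID computation):

```python
import hashlib

def content_address(data: bytes) -> str:
    """Hash the bytes themselves, so identical content always gets the same address."""
    return hashlib.sha256(data).hexdigest()

a = content_address(b"hello decentralized RAG")
b = content_address(b"hello decentralized RAG")
c = content_address(b"different document")

print(a == b)  # True: same content, same address
print(a == c)  # False: any change yields a new address
```

This property is what lets data nodes share and deduplicate knowledge over IPFS: the same document contributed by two nodes resolves to the same address.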