Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
Lab project to build a vector database from local files via ollama
https://github.com/clowa/ollama-timescale-vector
ai ollama timescaledb
Last synced: 14 days ago
- Host: GitHub
- URL: https://github.com/clowa/ollama-timescale-vector
- Owner: clowa
- Created: 2025-01-01T19:02:26.000Z (15 days ago)
- Default Branch: main
- Last Pushed: 2025-01-02T13:05:29.000Z (14 days ago)
- Last Synced: 2025-01-02T14:20:22.947Z (14 days ago)
- Topics: ai, ollama, timescaledb
- Language: Python
- Homepage:
- Size: 8.79 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# Overview
A containerized example app that uses LlamaIndex and Ollama to create a vector database.
This project combines several key components:
- Docker for containerization and deployment
- LlamaIndex for document indexing and retrieval
- Ollama for running local LLMs and embeddings
- TimescaleDB with the [pgvector](https://github.com/pgvector/pgvector) and [pgai](https://github.com/timescale/pgai) extensions as vector storage

## Prerequisites
- Docker and Docker Compose
- ~ 2GB disk space for LLM models
- Source documents to index mounted at `/data` in the app container

## Quick Start
1. Start the system:
```bash
docker compose up
```

This will:
- Start the Ollama service. If the required models _(llama3.2:1b and nomic-embed-text)_ haven't been downloaded yet, they are pulled on first run.
- Launch TimescaleDB for vector storage
- Start the main application that indexes your documents

### Devcontainer
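The three services described above are typically wired together in a single compose file. A rough sketch is shown below; the service names, image tags, and environment variables are assumptions for illustration, so consult the repository's actual `docker-compose.yml` for the real configuration:

```yaml
services:
  ollama:
    image: ollama/ollama
    volumes:
      - ollama-data:/root/.ollama   # persist pulled models between runs

  db:
    image: timescale/timescaledb-ha:pg16   # assumed image; bundles pgvector
    environment:
      POSTGRES_PASSWORD: postgres          # placeholder credentials

  app:
    build: .
    depends_on:
      - ollama
      - db
    volumes:
      - ./data:/data                # source documents to index

volumes:
  ollama-data:
```

Persisting `/root/.ollama` in a named volume means the ~2 GB of models is downloaded only once, not on every container restart.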
If you are using VSCode, you can use the provided devcontainer to develop the application.
> [!TIP]
> The devcontainer is configured to share the host's Ollama models with the container, so the container image doesn't grow huge. If you want to disable this behavior, you can comment out the mount of the `~/.ollama` directory in `devcontainer.json`.

## Components
- `app` - Python application that handles document processing and indexing
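Conceptually, the lookup step that pgvector performs inside the database is a nearest-neighbour search over stored embedding vectors. A minimal sketch of that ranking in plain Python (the vectors are toy 3-dimensional values, not real nomic-embed-text output, which has hundreds of dimensions):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def top_k(query, store, k=2):
    """Return the ids of the k stored vectors most similar to the query."""
    ranked = sorted(store.items(),
                    key=lambda item: cosine_similarity(query, item[1]),
                    reverse=True)
    return [doc_id for doc_id, _ in ranked[:k]]

# Toy "embeddings" keyed by document id.
store = {
    "doc_a": [1.0, 0.0, 0.0],
    "doc_b": [0.9, 0.1, 0.0],
    "doc_c": [0.0, 1.0, 0.0],
}
print(top_k([1.0, 0.05, 0.0], store))  # ['doc_a', 'doc_b']
```

In the real app this ranking happens in SQL via pgvector's distance operators, with TimescaleDB handling storage and indexing.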