Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/axemt/que
Fast LLM question answering + document indexing RAG in the command line
- Host: GitHub
- URL: https://github.com/axemt/que
- Owner: Axemt
- License: MIT
- Created: 2024-04-11T08:33:01.000Z (9 months ago)
- Default Branch: main
- Last Pushed: 2024-11-13T15:03:25.000Z (about 2 months ago)
- Last Synced: 2024-12-22T06:40:10.995Z (7 days ago)
- Topics: chromadb, cli, llamacpp, question-answering, rag
- Language: Python
- Homepage:
- Size: 219 KB
- Stars: 1
- Watchers: 2
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# Que
LLM question answering + local document vector storage.
## How it works
The `que` command line utility recursively indexes the documents in the directory you invoke it in and stores them in a vector database (ChromaDB) at `~/.config/que/index.chroma`. At query time, it uses cosine similarity to retrieve the texts most relevant to your question, feeds them as context to a Llama model, and has the model answer.
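The retrieval step above can be sketched in plain Python. This is an illustrative toy, not `que`'s actual code: the three-dimensional vectors stand in for real embeddings, and in practice ChromaDB performs this ranking internally.

```python
import math

def cosine_similarity(a, b):
    # cos(theta) = (a . b) / (|a| * |b|)
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def top_k(query_vec, doc_vecs, k=2):
    # Rank stored document vectors by similarity to the query vector
    # and return the ids of the k closest documents.
    scored = sorted(
        doc_vecs.items(),
        key=lambda item: cosine_similarity(query_vec, item[1]),
        reverse=True,
    )
    return [doc_id for doc_id, _ in scored[:k]]

# Hypothetical document embeddings (real embeddings have hundreds of dims).
docs = {
    "notes.md":  [0.9, 0.1, 0.0],
    "todo.txt":  [0.1, 0.8, 0.1],
    "paper.pdf": [0.7, 0.2, 0.1],
}
print(top_k([1.0, 0.0, 0.0], docs))  # -> ['notes.md', 'paper.pdf']
```

The returned documents are then concatenated into the prompt given to the Llama model as context.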
On subsequent invocations, `que` checks whether indexed files have changed or been deleted, or whether new documents are present, and updates the internal vector database incrementally, avoiding an expensive full re-indexing.
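One common way to implement this kind of change detection is to store a content digest per file and diff it against the filesystem on the next run. The sketch below is a minimal illustration of that idea, not `que`'s actual implementation; the function and dictionary names are hypothetical.

```python
import hashlib
from pathlib import Path

def file_digest(path: Path) -> str:
    # Content hash; a cheaper variant could compare (mtime, size) instead.
    return hashlib.sha256(path.read_bytes()).hexdigest()

def diff_index(stored: dict, root: Path):
    """Compare a stored {path: digest} mapping against the files on disk
    and report which paths are new, changed, or deleted since last run."""
    current = {str(p): file_digest(p) for p in root.rglob("*") if p.is_file()}
    added   = [p for p in current if p not in stored]
    changed = [p for p in current if p in stored and stored[p] != current[p]]
    deleted = [p for p in stored if p not in current]
    return added, changed, deleted
```

Only the `added` and `changed` files need to be re-embedded and upserted into the vector database, while `deleted` entries are dropped from it.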
`que` relies on `llama-cpp` for fast inference, with support for MPS, CUDA, and Vulkan backends.
## See it in action:
![example](example.jpg)