https://github.com/nicolay-r/bulk-chain-shell
Shell client 📺 for schema-based reasoning 🧠 over your data via custom LLM provider 🌌
- Host: GitHub
- URL: https://github.com/nicolay-r/bulk-chain-shell
- Owner: nicolay-r
- License: MIT
- Created: 2025-04-09T08:33:33.000Z
- Default Branch: master
- Last Pushed: 2025-04-09T15:15:03.000Z
- Last Synced: 2025-04-19T16:23:03.857Z
- Topics: bulk, chain-of-thought, cot, declarative, gpt, inference, llm, pipeline, reasoning, shell, sqlite3, ui
- Language: Python
- Homepage: https://github.com/nicolay-r/bulk-chain
- Size: 21.5 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 3
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# bulk-chain-shell 0.25.0

[Third-party providers hosting ↗️](https://github.com/nicolay-r/nlp-thirdgate?tab=readme-ov-file#llm)
This project is a [bash-shell client](https://en.wikipedia.org/wiki/Bash_(Unix_shell)) for the [bulk-chain](https://github.com/nicolay-r/bulk-chain) no-string framework, enabling reasoning over your data with a predefined [CoT prompt schema](https://github.com/nicolay-r/bulk-chain?tab=readme-ov-file#chain-of-thought-schema).
> 📺 This video showcases the application of the [↗️ Sentiment Analysis Schema](https://github.com/nicolay-r/bulk-chain/blob/master/test/schema/thor_cot_schema.json) to [LLaMA-3-70B-Instruct](https://replicate.com/meta/meta-llama-3-70b-instruct), hosted by Replicate, for reasoning over submitted texts.
### Extra Features
* ✅ **Progress caching [for remote LLMs]**: withstands exceptions during LLM calls by caching LLM answers with the `sqlite3` engine (see the sketch below);
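For intuition, here is a minimal sketch of this caching pattern in plain Python. It illustrates the idea only and is not the project's actual implementation; the table layout and the `cached_ask` helper are hypothetical:

```python
import sqlite3

def cached_ask(conn: sqlite3.Connection, model, prompt: str) -> str:
    """Return a cached answer when available; otherwise ask the model and persist the result."""
    conn.execute("CREATE TABLE IF NOT EXISTS cache (prompt TEXT PRIMARY KEY, answer TEXT)")
    row = conn.execute("SELECT answer FROM cache WHERE prompt = ?", (prompt,)).fetchone()
    if row is not None:
        # Already answered in a previous run: no repeated (and possibly paid) LLM call.
        return row[0]
    answer = model.ask(prompt)  # May raise on a remote failure; cached rows stay untouched.
    conn.execute("INSERT INTO cache VALUES (?, ?)", (prompt, answer))
    conn.commit()  # Persist immediately so progress survives a crash.
    return answer
```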
# Installation

```bash
pip install git+https://github.com/nicolay-r/bulk-chain-shell@master
```

# Usage
Use the shell client to interact with an LLM via the command line, with support for streaming LLM output. The video showcased above illustrates an example application: sentiment analysis, extracting the author's opinion towards the object mentioned in the text.

## Interactive Mode
Quick start:
1. ⬇️ Download the [replicate](https://replicate.com/) provider for `bulk-chain` (the same provider file as in the Inference Mode below):
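```bash
wget https://raw.githubusercontent.com/nicolay-r/nlp-thirdgate/refs/heads/master/llm/replicate_104.py
```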
2. 📜 Set up your reasoning `thor_cot_schema.json` according to the [following example ↗️](test/schema/thor_cot_schema.json) (a rough sketch of the schema shape is shown after these steps)
3. 🚀 Launch `demo.py` as follows:
```bash
python3 -m bulk_chain_shell.demo \
--schema "test/schema/thor_cot_schema.json" \
--adapter "dynamic:replicate_104.py:Replicate" \
%%m \
--model_name "meta/meta-llama-3-70b-instruct" \
--api_token "" \
--stream
```
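For orientation, a CoT schema is a JSON file that chains prompts, where each step can reference input columns and earlier outputs via `{...}` placeholders. The keys below are illustrative assumptions; treat the linked schema files as the authoritative format:

```json
{
  "schema": [
    {"prompt": "Given the text: {text}. Which object is the author's opinion directed towards?", "out": "object"},
    {"prompt": "Given the text: {text}. What is the sentiment towards {object}?", "out": "sentiment"}
  ]
}
```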
## Inference Mode

> **NOTE:** You have to install the `source-iter` and `tqdm` packages, which are the actual [dependencies](dependencies.txt) of this project.
1. ⬇️ Download the [replicate](https://replicate.com/) provider for `bulk-chain`:
```bash
wget https://raw.githubusercontent.com/nicolay-r/nlp-thirdgate/refs/heads/master/llm/replicate_104.py
```
2. 📜 Set up your reasoning `schema.json` according to the [following example ↗️](test/schema/default.json)
3. 🚀 Launch inference using `DeepSeek-R1`:
```bash
python3 -m bulk_chain_shell.infer \
--src "" \
--schema "test/schema/default.json" \
--adapter "replicate_104.py:Replicate" \
%%m \
--model_name "deepseek-ai/deepseek-r1" \
--api_token ""
```

# Embed your LLM
All you have to do is implement the `BaseLM` class, which includes:
* `__init__` -- for setting up *batching mode support* and an (optional) *model name*;
* `ask(prompt)` -- to infer your model with the given `prompt`.

See examples with models [at nlp-thirdgate 🌌](https://github.com/nicolay-r/nlp-thirdgate?tab=readme-ov-file#llm), or the minimal sketch below.
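A minimal provider could look like the following sketch. The import path and the keyword names passed to `BaseLM.__init__` are assumptions based on the description above; consult the nlp-thirdgate providers (e.g. `replicate_104.py`) for the real signatures:

```python
# Minimal sketch of a custom provider. The import path and the `name` /
# `support_batching` keywords are assumptions; see nlp-thirdgate for real examples.
from bulk_chain.core.llm_base import BaseLM


class EchoModel(BaseLM):
    """Hypothetical provider that echoes every prompt back."""

    def __init__(self, model_name="echo", **kwargs):
        # Declare batching support (here: disabled) and an optional model name.
        super(EchoModel, self).__init__(name=model_name, support_batching=False, **kwargs)

    def ask(self, prompt):
        # Replace this with an actual call to your LLM provider's API.
        return "ECHO: " + prompt
```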
# References
* bulk-chain no-string API: https://github.com/nicolay-r/bulk-chain
* Third-party providers hosting: https://github.com/nicolay-r/nlp-thirdgate?tab=readme-ov-file#llm