https://github.com/ianva/llm-bp
A CLI tool for batch processing files with custom LLM prompts. Streamline your workflow by applying any prompt to multiple files concurrently.
- Host: GitHub
- URL: https://github.com/ianva/llm-bp
- Owner: ianva
- Created: 2024-11-25T03:42:29.000Z (2 months ago)
- Default Branch: main
- Last Pushed: 2024-11-26T18:08:29.000Z (2 months ago)
- Last Synced: 2024-12-10T09:08:27.067Z (about 2 months ago)
- Topics: ai, batching, bun, cli, llm
- Language: JavaScript
- Homepage:
- Size: 26.4 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
# LLM Batch Processor (llm-bp)
A CLI tool for batch processing files using LLM prompts. Built with TypeScript and OpenAI's API.
## Features
- 🚀 Process text using OpenAI's GPT models
- 📝 Support both file input and pipeline input
- 🔄 Flexible output options (file or stdout)
- 📋 Use prompt strings or prompt files
- 🌐 Configurable OpenAI API settings

## Installation
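
The steps below use Bun to install dependencies and build. If Bun isn't installed yet, its official installer is one way to get it (assuming curl and a Unix-like shell):

```bash
# Install Bun via its official installer
curl -fsSL https://bun.sh/install | bash
```
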
1. Clone and install dependencies:
```bash
git clone https://github.com/ianva/llm-bp.git
cd llm-bp
bun install
```

2. Build the project:
```bash
bun run build
```

3. Make `lmb` available:
```bash
# Create user bin directory if it doesn't exist
mkdir -p ~/bin
# Create symlink
ln -s "$(pwd)/dist/index.js" ~/bin/lmb
# Add to PATH (add to your ~/.zshrc or ~/.bashrc)
export PATH="$HOME/bin:$PATH"
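
# Optional sanity check (open a new shell, or source your rc file first, so the PATH change takes effect)
which lmb
lmb --help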
```

## Configuration
1. Set up environment variables:
```bash
cp .env.example .env
```

2. Edit `.env` with your settings:
```bash
# OpenAI API Configuration
OPENAI_API_KEY=your-api-key-here
OPENAI_BASE_URL=https://api.openai.com/v1
OPENAI_MODEL=gpt-4 # or another available model
```
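
With the key in place, one quick way to verify the configuration is to run a tiny prompt through the pipeline form shown under Usage below:

```bash
# Prints a short model response to stdout if the API settings are valid
echo "ping" | lmb -p "Reply with the single word: pong"
```
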
## Usage

### Basic Usage
1. Process file and print to stdout:
```bash
# Using prompt string
lmb input.txt -p "Summarize this text"
# Using prompt file
lmb input.txt -f prompts/summarize.txt
```

2. Process file and save to directory:
```bash
# Using prompt string
lmb input.txt ./output -p "Summarize this text"
# Using prompt file
lmb input.txt ./output -f prompts/summarize.txt
```
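
The examples above process a single file per invocation; one way to fan the same prompt out over many files is a plain shell loop (a sketch; `notes/*.txt` is just a placeholder pattern):

```bash
# Apply the same prompt to every .txt file in a directory,
# writing each result into ./output
for f in notes/*.txt; do
  lmb "$f" ./output -p "Summarize this text"
done
```
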
### Pipeline Usage

1. Process pipeline input and print to stdout:
```bash
# Using prompt string
echo "Hello, world" | lmb -p "Translate to Chinese"
# Using prompt file
echo "Hello, world" | lmb -f prompts/translate_to_chinese.txt
```

2. Process pipeline input and save to file:
```bash
# Using prompt string
echo "Hello, world" | lmb "" ./output -p "Translate to Chinese"
# Using prompt file
echo "Hello, world" | lmb "" ./output -f prompts/translate_to_chinese.txt
```

### Command Options
```bash
Usage: lmb [input] [output] [options]

Arguments:
  input              Input file (optional for pipeline input)
  output             Output directory (optional, prints to stdout if not specified)

Options:
  -p, --prompt       Prompt string
  -f, --prompt-file  Prompt file
  -h, --help         Display help
```

## Example Prompt Files
Create prompt files in the `prompts` directory:
1. Translation prompt (`prompts/translate_to_chinese.txt`):
```
You are a professional, authentic machine translation engine.
Translate the following text to Chinese
```

2. Code review prompt (`prompts/code_review.txt`):
```
Review the following code and provide:
1. Potential bugs or issues
2. Suggestions for improvement
3. Best practices that could be applied
```
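
As one example of putting the code review prompt to work, a diff can be piped straight through it using the pipeline form shown earlier:

```bash
# Review the currently staged changes
git diff --staged | lmb -f prompts/code_review.txt
```
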
## Tips

- For consistent results, use a lower temperature setting in your prompts
- Create reusable prompt files for common tasks
- Use stdout for quick tasks and file output for batch processing
- Pipe output to other commands for further processing:
```bash
echo "Hello" | lmb -p "Translate to Chinese" | pbcopy # Copy to clipboard
```

## License
MIT