{"id":22704944,"url":"https://github.com/ianva/llm-bp","last_synced_at":"2026-04-19T02:07:53.805Z","repository":{"id":264564745,"uuid":"893699491","full_name":"ianva/llm-bp","owner":"ianva","description":"A CLI tool for batch processing files with custom LLM prompts. Streamline your workflow by applying any prompt to multiple files concurrently.","archived":false,"fork":false,"pushed_at":"2024-11-26T18:08:29.000Z","size":27,"stargazers_count":0,"open_issues_count":0,"forks_count":0,"subscribers_count":1,"default_branch":"main","last_synced_at":"2025-02-04T21:17:12.916Z","etag":null,"topics":["ai","batching","bun","cli","llm"],"latest_commit_sha":null,"homepage":"","language":"JavaScript","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":null,"status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/ianva.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":null,"code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2024-11-25T03:42:29.000Z","updated_at":"2024-11-26T18:08:32.000Z","dependencies_parsed_at":"2024-11-25T04:27:36.521Z","dependency_job_id":null,"html_url":"https://github.com/ianva/llm-bp","commit_stats":null,"previous_names":["ianva/llm-bp"],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ianva%2Fllm-bp","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ianva%2Fllm-bp/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ianva%2Fllm-bp/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ianva%2Fllm-bp/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/ianva","dow
nload_url":"https://codeload.github.com/ianva/llm-bp/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":246238700,"owners_count":20745581,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["ai","batching","bun","cli","llm"],"created_at":"2024-12-10T09:08:29.289Z","updated_at":"2026-04-19T02:07:53.760Z","avatar_url":"https://github.com/ianva.png","language":"JavaScript","funding_links":[],"categories":[],"sub_categories":[],"readme":"# LLM Batch Processor (llm-bp)\n\nA CLI tool for batch processing files using LLM prompts. Built with TypeScript and OpenAI's API.\n\n## Features\n\n- 🚀 Process text using OpenAI's GPT models\n- 📝 Support both file input and pipeline input\n- 🔄 Flexible output options (file or stdout)\n- 📋 Use prompt strings or prompt files\n- 🌐 Configurable OpenAI API settings\n\n## Installation\n\n1. Clone and install dependencies:\n   ```bash\n   git clone https://github.com/ianva/llm-bp.git\n   cd llm-bp\n   bun install\n   ```\n\n2. Build the project:\n   ```bash\n   bun run build\n   ```\n\n3. Make `lmb` available:\n   ```bash\n   # Create user bin directory if it doesn't exist\n   mkdir -p ~/bin\n   \n   # Create symlink\n   ln -s \"$(pwd)/dist/index.js\" ~/bin/lmb\n   \n   # Add to PATH (add to your ~/.zshrc or ~/.bashrc)\n   export PATH=\"$HOME/bin:$PATH\"\n   ```\n\n## Configuration\n\n1. Set up environment variables:\n   ```bash\n   cp .env.example .env\n   ```\n\n2. 
Edit `.env` with your settings:\n   ```bash\n   # OpenAI API Configuration\n   OPENAI_API_KEY=your-api-key-here\n   OPENAI_BASE_URL=https://api.openai.com/v1\n   OPENAI_MODEL=gpt-4  # or another available model\n   ```\n\n## Usage\n\n### Basic Usage\n\n1. Process file and print to stdout:\n   ```bash\n   # Using prompt string\n   lmb input.txt -p \"Summarize this text\"\n   \n   # Using prompt file\n   lmb input.txt -f prompts/summarize.txt\n   ```\n\n2. Process file and save to directory:\n   ```bash\n   # Using prompt string\n   lmb input.txt ./output -p \"Summarize this text\"\n   \n   # Using prompt file\n   lmb input.txt ./output -f prompts/summarize.txt\n   ```\n\n### Pipeline Usage\n\n1. Process pipeline input and print to stdout:\n   ```bash\n   # Using prompt string\n   echo \"Hello, world\" | lmb -p \"Translate to Chinese\"\n   \n   # Using prompt file\n   echo \"Hello, world\" | lmb -f prompts/translate_to_chinese.txt\n   ```\n\n2. Process pipeline input and save to file:\n   ```bash\n   # Using prompt string\n   echo \"Hello, world\" | lmb \"\" ./output -p \"Translate to Chinese\"\n   \n   # Using prompt file\n   echo \"Hello, world\" | lmb \"\" ./output -f prompts/translate_to_chinese.txt\n   ```\n\n### Command Options\n\n```bash\nUsage: lmb [input] [output] [options]\n\nArguments:\n  input   Input file (optional for pipeline input)\n  output  Output directory (optional, prints to stdout if not specified)\n\nOptions:\n  -p, --prompt \u003cstring\u003e  Prompt string\n  -f, --prompt-file \u003cfile\u003e  Prompt file\n  -h, --help  Display help\n```\n\n## Example Prompt Files\n\nCreate prompt files in the `prompts` directory:\n\n1. Translation prompt (`prompts/translate_to_chinese.txt`):\n   ```\n   You are a professional, authentic machine translation engine.\n   Translate the following text to Chinese\n   ```\n\n2. Code review prompt (`prompts/code_review.txt`):\n   ```\n   Review the following code and provide:\n   1. Potential bugs or issues\n   2. 
Suggestions for improvement\n   3. Best practices that could be applied\n   ```\n\n## Tips\n\n- For more consistent results, use a lower temperature setting in your OpenAI configuration\n- Create reusable prompt files for common tasks\n- Use stdout for quick tasks and file output for batch processing\n- Pipe output to other commands for further processing:\n  ```bash\n  echo \"Hello\" | lmb -p \"Translate to Chinese\" | pbcopy  # Copy to clipboard\n  ```\n\n## License\n\nMIT\n","project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fianva%2Fllm-bp","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fianva%2Fllm-bp","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fianva%2Fllm-bp/lists"}