https://github.com/evg4b/donkey
🫏 A small utility for batch file processing using AI
- Host: GitHub
- URL: https://github.com/evg4b/donkey
- Owner: evg4b
- License: gpl-3.0
- Created: 2024-11-30T05:21:33.000Z (10 months ago)
- Default Branch: main
- Last Pushed: 2024-12-21T14:43:36.000Z (10 months ago)
- Last Synced: 2025-02-12T18:36:30.241Z (8 months ago)
- Topics: ai, cli, ollama-client, processing, utility
- Language: Go
- Homepage:
- Size: 31.3 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
Donkey 🫏
A small utility for batch file processing using AI.

## Get started
#### Prerequisites
- Install [Ollama](https://ollama.com/)
- Download a model for processing: `ollama pull mistral-small` (you can change the model in `~/.donkey.toml`)

Then you can install the application in one of the following ways:
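The prerequisites mention that the model can be changed in `~/.donkey.toml`. A hypothetical sketch of what that file might contain (the key name `model` is an assumption, not confirmed by the project documentation — check the repository for the actual schema):

```toml
# Hypothetical ~/.donkey.toml — the key name below is an assumption.
model = "mistral-small"
```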
### [Homebrew](https://brew.sh/) (macOS | Linux)
```bash
brew install evg4b/tap/donkey
```

### [Scoop](https://scoop.sh/) (Windows)
```bash
scoop bucket add evg4b https://github.com/evg4b/scoop-bucket.git
scoop install evg4b/donkey
```

### [NPM](https://npmjs.com) (Cross-platform)
```bash
npx -y @evg4b/donkey ...
```

### [Stew](https://github.com/marwanhawari/stew) (Cross-platform)
```bash
stew install evg4b/donkey
```

### Binary (Cross-platform)
Download the appropriate version for your platform from the [donkey releases page](https://github.com/evg4b/donkey/releases/latest). Once downloaded, the binary can be run from anywhere; you don't need to install it into a global location. This works well for shared hosts and other systems where you don't have a privileged account.
Ideally, you should install it somewhere in your `PATH` for easy use; `/usr/local/bin` is the most common location.
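As a sketch, the manual install step might look like the following. Paths and filenames are illustrative; the demo uses a temporary directory and a placeholder script in place of a real release binary so it can run anywhere:

```shell
#!/bin/sh
set -eu

# Stand-in for a downloaded release binary (a placeholder script for the demo).
tmpdir=$(mktemp -d)
printf '#!/bin/sh\necho donkey\n' > "$tmpdir/donkey"
chmod +x "$tmpdir/donkey"

# In practice you would install into a PATH directory, e.g.:
#   sudo install -m 0755 donkey /usr/local/bin/donkey
bindir="$tmpdir/bin"
mkdir -p "$bindir"
install -m 0755 "$tmpdir/donkey" "$bindir/donkey"

# Verify the binary now resolves via PATH.
PATH="$bindir:$PATH" command -v donkey
```

The same pattern works for any single-binary tool: copy the executable into a directory on `PATH` and mark it executable.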
> [!CAUTION]
>
> This program is a simple utility for batch processing of files using AI.
> The final result depends on the model used and your request.
> By running it, you accept responsibility for any changes made to your file system.