{"id":27425759,"url":"https://github.com/bikz/gdot-ai-commit","last_synced_at":"2025-06-17T22:33:56.476Z","repository":{"id":287228115,"uuid":"964047036","full_name":"Bikz/gdot-ai-commit","owner":"Bikz","description":"A lightning-fast utility for Git that stages, commits with AI-generated messages, and pushes—all with one simple command: g.","archived":false,"fork":false,"pushed_at":"2025-04-12T17:22:18.000Z","size":126,"stargazers_count":0,"open_issues_count":0,"forks_count":0,"subscribers_count":1,"default_branch":"main","last_synced_at":"2025-04-14T12:47:55.999Z","etag":null,"topics":["ai-git-commit","auto-commit","commit-message","fast-commit","git-automation","llama3","ollama","one-command-git-commit","qwen"],"latest_commit_sha":null,"homepage":"","language":"Shell","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"gpl-3.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/Bikz.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":".github/CODEOWNERS","security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null}},"created_at":"2025-04-10T15:54:30.000Z","updated_at":"2025-04-12T17:22:21.000Z","dependencies_parsed_at":"2025-06-17T22:32:48.101Z","dependency_job_id":null,"html_url":"https://github.com/Bikz/gdot-ai-commit","commit_stats":null,"previous_names":["bikz/git-ai-commit","bikz/gdot-ai-commit"],"tags_count":0,"template":false,"template_full_name":null,"purl":"pkg:github/Bikz/gdot-ai-commit","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Bikz%2Fgdot-ai-commit","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Bikz%2Fgdot-ai-commit/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repo
sitories/Bikz%2Fgdot-ai-commit/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Bikz%2Fgdot-ai-commit/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/Bikz","download_url":"https://codeload.github.com/Bikz/gdot-ai-commit/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Bikz%2Fgdot-ai-commit/sbom","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":260450756,"owners_count":23011139,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["ai-git-commit","auto-commit","commit-message","fast-commit","git-automation","llama3","ollama","one-command-git-commit","qwen"],"created_at":"2025-04-14T12:28:47.235Z","updated_at":"2025-06-17T22:33:51.445Z","avatar_url":"https://github.com/Bikz.png","language":"Shell","readme":"# gDot-ai-commit (g.)\n\nA lightning-fast utility for Git that stages, commits with AI-generated messages, and pushes—all with one simple command: `g.`\n\n## Features\n\n- **Ultra-fast workflow**: Stage, commit, and push with a single command.\n- **AI-powered commit messages**: Uses Ollama with the lightweight qwen2.5-coder:1.5b model (~1GB).\n- **Privacy-focused**: All processing happens on your machine.\n- **Minimal keystrokes**: Just type `g.` and you're done.\n- **Works with your flow**: Optionally provide your own commit message.\n- **Clean, informative output**: Provides clear feedback at each step of the process.\n- **Automatic Update Notifications**: Checks daily for new versions and notifies you.\n- **Simple Manual 
Update**: Use `g. --update` to get the latest version anytime.\n\n## Prerequisites\n\n1. **Git**: Must be installed and configured.\n2. **Ollama**: Required for AI generation. The installer will check if Ollama is installed and guide you if not. Get it from [Ollama Website](https://ollama.ai).\n3. **An Ollama Model**: The script defaults to `qwen2.5-coder:1.5b` (a small ~1GB model optimized for code). The installer will check if this model is available and prompt you to pull it if it's missing (`ollama pull qwen2.5-coder:1.5b`).\n4. **`curl`**: Needed for the one-line installer (usually pre-installed on macOS/Linux).\n5. **`jq`**: Needed by the `g.` script to function reliably (install via `brew install jq`, `sudo apt install jq`, etc.). The script will error if `jq` is missing.\n\n## Installation\n\n### Option 1: One-line installer\n\n```bash\ncurl -s https://raw.githubusercontent.com/Bikz/gDot-ai-commit/main/install.sh | bash\n```\n\n### Option 2: Manual installation\n\n1. **Create the directory if needed:**\n\n    ```bash\n    mkdir -p ~/.local/bin\n    ```\n\n2. **Download the script:**\n\n    ```bash\n    curl -s https://raw.githubusercontent.com/Bikz/gDot-ai-commit/main/g -o ~/.local/bin/g.\n    ```\n\n3. **Make it executable:**\n\n    ```bash\n    chmod +x ~/.local/bin/g.\n    ```\n\n4. **Ensure `~/.local/bin` is in your PATH:**\n\n    Check with `echo $PATH`. If it's not listed, add it to your shell configuration file (e.g., `~/.bashrc`, `~/.zshrc`, `~/.profile`, `~/.config/fish/config.fish`). Add a line like this:\n\n    ```bash\n    export PATH=\"$HOME/.local/bin:$PATH\"\n    ```\n\n    Then, restart your terminal or source the config file (e.g., `source ~/.zshrc`).\n\n## Usage\n\n```bash\n# Auto-commit with AI-generated message (uses default model 'qwen2.5-coder:1.5b')\ng.\n\n# Use your own commit message instead of AI generation\ng. 
\"fix: resolved authentication issue in login form\"\n```\n\n## Updating\n\n`gDot-ai-commit` includes a built-in mechanism to check for updates daily.\n\n- **Automatic Check:** Once a day, the script will automatically check GitHub for a newer version. If one is found, it will print a notification suggesting you update.\n- **Manual Update:** To manually trigger an update at any time, run:\n\n```bash\ng. --update\n```\n\nThis command will download the latest version of the g. script and replace your current one. You might need to restart your terminal session or run `hash -r` for the changes to take effect immediately.\n\n## Configuration\n\nYou can override defaults using environment variables before running the script (e.g., `GAC_MODEL=mistral g.`) or by editing the `g.` script file (`~/.local/bin/g.`) directly:\n\n- **`MODEL`**: The Ollama model to use (default: \"qwen2.5-coder:1.5b\"). Change this if you prefer another model (e.g., \"llama3\", \"mistral\", \"codegemma\"). Make sure you pull it first (`ollama pull \u003cmodel_name\u003e`).\n- **`OLLAMA_ENDPOINT`**: The URL for the Ollama API (default: \"\u003chttp://localhost:11434/api/chat\u003e\").\n- **`TEMP`**: Temperature setting for generation (default: 0.2).\n\n## How it works\n\nAfter you enter `g.` in your terminal, this utility will automatically:\n\n1. Stage all changes (`git add .`).\n2. Get the diff information (`git diff --staged`).\n3. If no message is provided as an argument, generate a commit message based on the diff using Ollama via its API.\n4. Commit with the generated or provided message (`git commit -m \"...\"`).\n5. 
Push to the appropriate remote and branch (`git push`).\n\n## Troubleshooting\n\n- **\"Command not found: g.\"**: Ensure the installation directory (`~/.local/bin`) is correctly added to your `$PATH` environment variable and you've restarted your terminal or sourced your shell profile.\n- **\"Error: 'jq' command not found...\"**: Install `jq` using your system's package manager (e.g., `brew install jq` on macOS, `sudo apt install jq` on Debian/Ubuntu). The script requires `jq` for reliable operation.\n- **\"Error: 'ollama' command not found\"**: Install Ollama from [Ollama Website](https://ollama.ai).\n- **\"Error: Failed to communicate with Ollama API...\"**: Make sure the Ollama application or service is running (`ollama ps` or check system services). Check if the `OLLAMA_ENDPOINT` in the script is correct.\n- **\"Error: Ollama API returned an error: model '...' not found\"**: Ensure the model specified by the `MODEL` variable in the script (or `GAC_MODEL` env var) has been pulled (`ollama pull \u003cmodel_name\u003e`) and is listed in `ollama list`.\n\n## Other Issues\n\nIf you encounter any bugs or are still facing other issues, please open an issue on [GitHub Issues](https://github.com/Bikz/gDot-ai-commit/issues).\n\n## Uninstalling\n\n### Uninstalling g. Script\n\n1. Remove the script file:\n\n    ```bash\n    rm ~/.local/bin/g.\n    ```\n\n2. 
(Optional) Remove from PATH if needed:\n\n    ```bash\n    # Edit your shell config file (~/.bashrc or ~/.zshrc) and remove/comment out this line:\n    export PATH=\"$HOME/.local/bin:$PATH\"\n\n    # Then reload your shell configuration\n    source ~/.bashrc  # or source ~/.zshrc\n    ```\n\n### Uninstalling Ollama\n\n**On macOS:**\n\n```bash\n# Stop Ollama service/app (adapt if run manually)\nlaunchctl unload ~/Library/LaunchAgents/com.ollama.ollama.plist 2\u003e/dev/null\nps aux | grep Ollama | grep -v grep | awk '{print $2}' | xargs kill 2\u003e/dev/null\n\n# Remove Application and CLI tool\nrm -rf /Applications/Ollama.app\nrm /usr/local/bin/ollama 2\u003e/dev/null\nrm /opt/homebrew/bin/ollama 2\u003e/dev/null # If installed via Homebrew\n\n# Remove data and models (WARNING: This deletes all pulled models)\nrm -rf ~/.ollama\n\n# Remove launch agent config\nrm ~/Library/LaunchAgents/com.ollama.ollama.plist 2\u003e/dev/null\nlaunchctl remove com.ollama.ollama 2\u003e/dev/null\n```\n\n**On Linux:**\n\n```bash\n# Stop Ollama service (if using systemd)\nsudo systemctl stop ollama 2\u003e/dev/null\nsudo systemctl disable ollama 2\u003e/dev/null\n\n# Remove CLI tool\nsudo rm /usr/local/bin/ollama 2\u003e/dev/null\nsudo rm /usr/bin/ollama 2\u003e/dev/null\n\n# Remove data and models (WARNING: This deletes all pulled models)\nrm -rf ~/.ollama\n\n# Remove systemd service file\nsudo rm /etc/systemd/system/ollama.service 2\u003e/dev/null\nsudo systemctl daemon-reload 2\u003e/dev/null\n```\n\n## Contributing\n\nContributions welcome! Please feel free to submit a Pull Request to [GitHub Repository](https://github.com/Bikz/gDot-ai-commit).\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fbikz%2Fgdot-ai-commit","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fbikz%2Fgdot-ai-commit","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fbikz%2Fgdot-ai-commit/lists"}