# Shell Sage 🐚✨

**Intelligent Terminal Companion | AI-Powered Terminal Assistant**  
*(Development Preview - v0.2.0)*

---

## Features

### 🌟 Next-Gen Terminal Experience
- 🏠 Local AI Support (Ollama) & Cloud AI (Groq)
- 🔍 Context-aware error diagnosis
- 🪄 Natural language to command translation
- ⚡ Safe command execution workflows

## 🔧 Core Capabilities

### Error Diagnosis

```bash
# Error analysis example
$ rm -rf /important-folder
🔎 Analysis → 🛠️ Fix: `rm -rf ./important-folder`
```
![Error Analysis](screenshots/01_up.png)
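The interception step can be pictured as a thin wrapper that runs a command and, on failure, packages the exit code and stderr into a prompt for the model. The sketch below is illustrative only; the `run_and_diagnose` helper and prompt format are hypothetical, not Shell Sage's actual internals:

```python
import subprocess
from typing import List, Optional

def run_and_diagnose(cmd: List[str]) -> Optional[str]:
    """Run a command; on failure, build an LLM prompt from the error context."""
    result = subprocess.run(cmd, capture_output=True, text=True)
    if result.returncode == 0:
        return None  # command succeeded, nothing to diagnose
    # The context a model needs: the command, its exit code, and its stderr.
    return (
        f"Command: {' '.join(cmd)}\n"
        f"Exit code: {result.returncode}\n"
        f"Stderr: {result.stderr.strip()}\n"
        "Suggest a corrected command."
    )

prompt = run_and_diagnose(["ls", "/no/such/dir"])
```

On success the helper returns `None`; on failure the returned prompt would be forwarded to the configured local or API model.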
### Natural Language to Commands

```bash
# Command generation
$ shellsage ask "find large files over 1GB"
# → find / -type f -size +1G -exec ls -lh {} \;
```
![Command generation](screenshots/02.png)

### ⚡ Interactive Workflows
- Confirm before executing generated commands
- Step-by-step guidance for complex operations
- Safety checks for destructive commands
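A confirmation workflow along these lines can be sketched in a few lines of Python. The `DESTRUCTIVE_PATTERNS` list and both function names are invented for illustration; Shell Sage's real safety checks may differ:

```python
import re
import subprocess

# Patterns treated as destructive -- purely illustrative, not Shell Sage's real list.
DESTRUCTIVE_PATTERNS = [r"\brm\s+-rf?\b", r"\bmkfs\b", r"\bdd\s+if=", r">\s*/dev/sd"]

def is_destructive(command: str) -> bool:
    """Flag commands matching any known-dangerous pattern."""
    return any(re.search(p, command) for p in DESTRUCTIVE_PATTERNS)

def confirm_and_run(command: str, assume_yes: bool = False) -> bool:
    """Show the generated command, warn if destructive, run only on confirmation."""
    if is_destructive(command):
        print(f"⚠️  Potentially destructive: {command}")
    if not assume_yes and input(f"Run `{command}`? [y/N] ").strip().lower() != "y":
        print("Skipped.")
        return False
    subprocess.run(command, shell=True, check=False)
    return True
```

Defaulting to "No" on anything but an explicit `y` keeps the destructive path opt-in rather than opt-out.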
### 🌐 Supported API Providers
- Groq
- OpenAI
- Anthropic
- Fireworks.ai
- OpenRouter
- Deepseek

*Switch providers with `shellsage config --provider <name>`*

---

## Installation

### Prerequisites
- Python 3.8+
- 4GB+ RAM recommended for local models

```bash
# 1. Clone & install Shell Sage
git clone https://github.com/dheerajcl/Shellsage.git
cd Shellsage
./install.sh

# 2. Install Ollama for local AI
curl -fsSL https://ollama.com/install.sh | sh

# 3. Pull a base model (~3.8GB), for example:
ollama pull llama3:8b-instruct-q4_1

# Or use an API key instead (currently supports Groq, OpenAI, Anthropic,
# Fireworks, OpenRouter, Deepseek). Put your provider's API key in the .env file:
shellsage config --mode api --provider groq
```

### Configuration Notes
- Rename `.env.example` → `.env` and populate the required values
- API performance varies by provider (Groq fastest, Anthropic most capable)
- Local models need 4GB+ RAM (llama3:8b) to 16GB+ (llama3:70b)
- Response quality depends on the selected model's capabilities

### Custom Model Selection

While we provide common defaults for each AI provider, many services offer hundreds of models. To use a specific model:

- Check your provider's documentation for available models
- Set it in `.env`:
```
API_PROVIDER=openrouter
API_MODEL=your-model-name-here  # e.g. google/gemini-2.0-pro-exp-02-05:free
```

---
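A minimal sketch of how `.env`-driven model selection could work. The tiny parser and the `DEFAULTS` table are illustrative assumptions, not the project's actual loader (which may well use a library such as python-dotenv):

```python
import os

def load_env(path: str = ".env") -> dict:
    """Minimal .env reader: KEY=VALUE per line, '#' starts a comment."""
    values = {}
    if os.path.exists(path):
        with open(path) as fh:
            for line in fh:
                line = line.split("#", 1)[0].strip()
                if "=" in line:
                    key, _, val = line.partition("=")
                    values[key.strip()] = val.strip()
    return values

# Per-provider fallback models; API_MODEL in .env always wins when set.
DEFAULTS = {"groq": "mixtral-8x7b-32768"}

env = load_env()
provider = env.get("API_PROVIDER", "groq")
model = env.get("API_MODEL") or DEFAULTS.get(provider, "")
```

With no `.env` present this falls back to the provider default, so a fresh checkout still gets a working configuration.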
## Configuration

### First-Time Setup
```bash
# Interactive configuration wizard
shellsage setup

? Select operation mode:
  ▸ Local (Privacy-first, needs 4GB+ RAM)
    API (Faster but requires internet)

? Choose local model:
  ▸ llama3:8b-instruct-q4_1 (Recommended)
    mistral:7b-instruct-v0.3
    phi3:mini-128k-instruct

# If API mode is selected:
? Choose API provider:
  ▸ Groq
    OpenAI
    Anthropic
    Fireworks
    Deepseek

? Enter Groq API key: [hidden input]

? Select Groq model:
  ▸ mixtral-8x7b-32768
    llama3-70b-8192
# You aren't limited to the models shown here; set any model your provider
# supports via `API_MODEL=` in your .env

✅ API configuration updated!
```

### Runtime Control

```bash
# Switch modes
shellsage config --mode api  # or 'local'

# Switch to a specific model
shellsage config --mode local --model <model_name>

# Interactive switch
shellsage config --mode local
? Select local model:
  ▸ llama3:8b-instruct-q4_1
    mistral:7b-instruct-v0.3
    phi3:mini-128k-instruct
```

![interactive_flow1](screenshots/03.png)

![interactive_flow2](screenshots/04.png)

---

## Development Status 🚧

Shell Sage is currently in **alpha development**.

**Known Limitations**:
- Limited Windows support
- Compatibility issues with zsh and fish
- Occasional false positives in error detection
- API mode requires a provider-specific key

**Roadmap**:
- [x] Local LLM support
- [x] Hybrid cloud (API)/local mode switching
- [x] Model configuration wizard
- [ ] Better context awareness
- [ ] Windows PowerShell integration
- [ ] Tmux integration
- [ ] CI/CD error pattern database

---

## Contributing

We welcome contributions! Please follow these steps:

1. Fork the repository
2. Create a feature branch (`git checkout -b feat/amazing-feature`)
3. Commit your changes (`git commit -m 'Add amazing feature'`)
4. Push to the branch (`git push origin feat/amazing-feature`)
5. Open a Pull Request

---

> **Note**: This project is not affiliated with any API or model providers.  
> Local models require adequate system resources.  
> Internet access is required for initial setup and API mode.  
> Use at your own risk with critical operations.  
> Always verify commands before execution.