{"id":22561569,"url":"https://github.com/adhikasp/mcp-client-cli","last_synced_at":"2025-05-16T12:12:20.472Z","repository":{"id":265101438,"uuid":"895079444","full_name":"adhikasp/mcp-client-cli","owner":"adhikasp","description":"A simple CLI to run LLM prompt and implement MCP client.","archived":false,"fork":false,"pushed_at":"2025-05-01T06:33:25.000Z","size":304,"stargazers_count":432,"open_issues_count":14,"forks_count":56,"subscribers_count":3,"default_branch":"master","last_synced_at":"2025-05-01T07:20:46.280Z","etag":null,"topics":["langchain","llm","mcp","model-context-protocol"],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/adhikasp.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null}},"created_at":"2024-11-27T14:21:00.000Z","updated_at":"2025-05-01T06:32:54.000Z","dependencies_parsed_at":"2024-11-29T19:36:39.336Z","dependency_job_id":"e6499bf9-08a7-401d-bb9f-29b32c9a2781","html_url":"https://github.com/adhikasp/mcp-client-cli","commit_stats":null,"previous_names":["adhikasp/mcp-exploration","adhikasp/mcp-client-cli"],"tags_count":1,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/adhikasp%2Fmcp-client-cli","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/adhikasp%2Fmcp-client-cli/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/adhikasp%2Fmcp-client-cli/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/adhikas
p%2Fmcp-client-cli/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/adhikasp","download_url":"https://codeload.github.com/adhikasp/mcp-client-cli/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":254527099,"owners_count":22085919,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["langchain","llm","mcp","model-context-protocol"],"created_at":"2024-12-07T22:07:58.966Z","updated_at":"2025-05-16T12:12:20.449Z","avatar_url":"https://github.com/adhikasp.png","language":"Python","readme":"# MCP CLI client\n\nA simple CLI program to run LLM prompts and act as a [Model Context Protocol (MCP)](https://modelcontextprotocol.io/) client.\n\nYou can use any [MCP-compatible server](https://github.com/punkpeye/awesome-mcp-servers) from the convenience of your terminal.\n\nIt acts as an alternative client to Claude Desktop. Additionally, you can use any LLM provider such as OpenAI or Groq, or a local LLM via [llama.cpp](https://github.com/ggerganov/llama.cpp).\n\n![C4 Diagram](https://raw.githubusercontent.com/adhikasp/mcp-client-cli/refs/heads/master/c4_diagram.png)\n\n## Setup\n\n1. Install via pip:\n   ```bash\n   pip install mcp-client-cli\n   ```\n\n2. 
Create a `~/.llm/config.json` file to configure your LLM and MCP servers:\n   ```json\n   {\n     \"systemPrompt\": \"You are an AI assistant helping a software engineer...\",\n     \"llm\": {\n       \"provider\": \"openai\",\n       \"model\": \"gpt-4\",\n       \"api_key\": \"your-openai-api-key\",\n       \"temperature\": 0.7,\n       \"base_url\": \"https://api.openai.com/v1\"  // Optional, for OpenRouter or other providers\n     },\n     \"mcpServers\": {\n       \"fetch\": {\n         \"command\": \"uvx\",\n         \"args\": [\"mcp-server-fetch\"],\n         \"requires_confirmation\": [\"fetch\"],\n         \"enabled\": true,  // Optional, defaults to true\n         \"exclude_tools\": []  // Optional, list of tool names to exclude\n       },\n       \"brave-search\": {\n         \"command\": \"npx\",\n         \"args\": [\"-y\", \"@modelcontextprotocol/server-brave-search\"],\n         \"env\": {\n           \"BRAVE_API_KEY\": \"your-brave-api-key\"\n         },\n         \"requires_confirmation\": [\"brave_web_search\"]\n       },\n       \"youtube\": {\n         \"command\": \"uvx\",\n         \"args\": [\"--from\", \"git+https://github.com/adhikasp/mcp-youtube\", \"mcp-youtube\"]\n       }\n     }\n   }\n   ```\n\n   Note:\n   - See [CONFIG.md](CONFIG.md) for complete documentation of the configuration format\n   - Use `requires_confirmation` to specify which tools need user confirmation before execution\n   - The LLM API key can also be set via environment variables `LLM_API_KEY` or `OPENAI_API_KEY`\n   - The config file can be placed in either `~/.llm/config.json` or `$PWD/.llm/config.json`\n   - You can add `//` comments to the JSON config file, which makes it easy to switch between configurations\n\n3. 
Run the CLI:\n   ```bash\n   llm \"What is the capital city of North Sumatra?\"\n   ```\n\n## Usage\n\n### Basic Usage\n\n```bash\n$ llm What is the capital city of North Sumatra?\nThe capital city of North Sumatra is Medan.\n```\n\nYou can omit the quotes, but be careful with bash special characters like `\u0026`, `|`, `;` that might be interpreted by your shell.\n\nYou can also pipe input from other commands or files:\n\n```bash\n$ echo \"What is the capital city of North Sumatra?\" | llm\nThe capital city of North Sumatra is Medan.\n\n$ echo \"Given a location, tell me its capital city.\" \u003e instructions.txt\n$ cat instructions.txt | llm \"West Java\"\nThe capital city of West Java is Bandung.\n```\n\n### Image Input\n\nYou can pipe image files to analyze them with multimodal LLMs:\n\n```bash\n$ cat image.jpg | llm \"What do you see in this image?\"\n[LLM will analyze and describe the image]\n\n$ cat screenshot.png | llm \"Is there any error in this screenshot?\"\n[LLM will analyze the screenshot and point out any errors]\n```\n\n### Using Prompt Templates\n\nYou can use predefined prompt templates with the `p` prefix, followed by the template name and its arguments:\n\n```bash\n# List available prompt templates\n$ llm --list-prompts\n\n# Use a template\n$ llm p review  # Review git changes\n$ llm p commit  # Generate commit message\n$ llm p yt url=https://youtube.com/...  # Summarize YouTube video\n```\n\n### Triggering a tool\n\n```bash\n$ llm What is the top article on hackernews today?\n\n================================== Ai Message ==================================\nTool Calls:\n  brave_web_search (call_eXmFQizLUp8TKBgPtgFo71et)\n Call ID: call_eXmFQizLUp8TKBgPtgFo71et\n  Args:\n    query: site:news.ycombinator.com\n    count: 1\nBrave Search MCP Server running on stdio\n\n# If the tool requires confirmation, you'll be prompted:\nConfirm tool call? 
[y/n]: y\n\n================================== Ai Message ==================================\nTool Calls:\n  fetch (call_xH32S0QKqMfudgN1ZGV6vH1P)\n Call ID: call_xH32S0QKqMfudgN1ZGV6vH1P\n  Args:\n    url: https://news.ycombinator.com/\n================================= Tool Message =================================\nName: fetch\n\n[TextContent(type='text', text='Contents [REDACTED]]\n================================== Ai Message ==================================\n\nThe top article on Hacker News today is:\n\n### [Why pipes sometimes get \"stuck\": buffering](https://jvns.ca)\n- **Points:** 31\n- **Posted by:** tanelpoder\n- **Posted:** 1 hour ago\n\nYou can view the full list of articles on [Hacker News](https://news.ycombinator.com/)\n```\n\nTo bypass tool confirmation requirements, use the `--no-confirmations` flag:\n\n```bash\n$ llm --no-confirmations \"What is the top article on hackernews today?\"\n```\n\nFor use in bash scripts, add the `--no-intermediates` flag so that only the concluding message is printed, without intermediate messages:\n\n```bash\n$ llm --no-intermediates \"What is the time in Tokyo right now?\"\n```\n\n### Continuation\n\nAdd a `c ` prefix to your message to continue the last conversation.\n\n```bash\n$ llm asldkfjasdfkl\nIt seems like your message might have been a typo or an error. Could you please clarify or provide more details about what you need help with?\n$ llm c what did i say previously?\nYou previously typed \"asldkfjasdfkl,\" which appears to be a random string of characters. 
If you meant to ask something specific or if you have a question, please let me know!\n```\n\n### Clipboard Support\n\nYou can send content from your clipboard to the LLM with the `cb` command:\n\n```bash\n# After copying text to clipboard\n$ llm cb\n[LLM will process the clipboard text]\n\n$ llm cb \"What language is this code written in?\"\n[LLM will analyze the clipboard text with your question]\n\n# After copying an image to clipboard\n$ llm cb \"What do you see in this image?\"\n[LLM will analyze the clipboard image]\n\n# You can combine it with continuation\n$ llm cb c \"Tell me more about what you see\"\n[LLM will continue the conversation about the clipboard content]\n```\n\nThe clipboard feature works in:\n- Native Windows/macOS/Linux environments\n  - Windows: Uses PowerShell\n  - macOS: Uses `pbpaste` for text, `pngpaste` for images (optional)\n  - Linux: Uses `xclip` (required for clipboard support)\n- Windows Subsystem for Linux (WSL)\n  - Accesses the Windows clipboard through PowerShell\n  - Works with both text and images\n  - Make sure you have access to `powershell.exe` from WSL\n\nRequired tools for clipboard support:\n- Windows: PowerShell (built-in)\n- macOS: \n  - `pbpaste` (built-in) for text\n  - `pngpaste` (optional) for images: `brew install pngpaste`\n- Linux: \n  - `xclip`: `sudo apt install xclip` or equivalent\n\nThe CLI automatically detects whether the clipboard content is text or an image and handles it appropriately.\n\n### Additional Options\n\n```bash\n$ llm --list-tools                # List all available tools\n$ llm --list-prompts              # List available prompt templates\n$ llm --no-tools                  # Run without any tools\n$ llm --force-refresh             # Force refresh tool capabilities cache\n$ llm --text-only                 # Output raw text without markdown formatting\n$ llm --show-memories             # Show user memories\n$ llm --model gpt-4               # Override the model specified in config\n```\n\n## 
Contributing\n\nFeel free to submit issues and pull requests for improvements or bug fixes.\n","funding_links":[],"categories":["📚 Projects (1974 total)","🤖 AI/ML","MCP Clients","Example usage"],"sub_categories":["MCP Servers","CLI Tools","🖲️ Command Line Interfaces","Manual Installation"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fadhikasp%2Fmcp-client-cli","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fadhikasp%2Fmcp-client-cli","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fadhikasp%2Fmcp-client-cli/lists"}