{"id":21029381,"url":"https://github.com/knuckles-team/repository-manager","last_synced_at":"2026-03-09T01:01:56.642Z","repository":{"id":57750004,"uuid":"525153003","full_name":"Knuckles-Team/repository-manager","owner":"Knuckles-Team","description":"Manage your git repositories. Now supports Agentic AI MCP Server Integration!","archived":false,"fork":false,"pushed_at":"2026-02-27T13:15:18.000Z","size":2137,"stargazers_count":2,"open_issues_count":1,"forks_count":0,"subscribers_count":1,"default_branch":"main","last_synced_at":"2026-02-27T14:35:46.743Z","etag":null,"topics":["a2a","a2a-server","ag-ui","git","git-cli","mcp-server","python","repository","repository-management"],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/Knuckles-Team.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null,"notice":null,"maintainers":null,"copyright":null,"agents":null,"dco":null,"cla":null}},"created_at":"2022-08-15T22:09:34.000Z","updated_at":"2026-02-27T13:14:20.000Z","dependencies_parsed_at":"2026-02-18T06:02:06.722Z","dependency_job_id":null,"html_url":"https://github.com/Knuckles-Team/repository-manager","commit_stats":{"total_commits":20,"total_committers":2,"mean_commits":10.0,"dds":0.25,"last_synced_commit":"65137cda886a5f79e8e17849e37e1502faea3828"},"previous_names":[],"tags_count":100,"template":false,"template_full_name":null,"purl":"pkg:github/Knuckles-Team/repository-manager","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Knuckles-Team%2Frepository-manag
er","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Knuckles-Team%2Frepository-manager/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Knuckles-Team%2Frepository-manager/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Knuckles-Team%2Frepository-manager/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/Knuckles-Team","download_url":"https://codeload.github.com/Knuckles-Team/repository-manager/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Knuckles-Team%2Frepository-manager/sbom","scorecard":null,"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":286080680,"owners_count":29995910,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2026-03-02T01:47:34.672Z","status":"online","status_checked_at":"2026-03-02T02:00:07.342Z","response_time":60,"last_error":null,"robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":true,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["a2a","a2a-server","ag-ui","git","git-cli","mcp-server","python","repository","repository-management"],"created_at":"2024-11-19T12:12:06.069Z","updated_at":"2026-03-09T01:01:56.624Z","avatar_url":"https://github.com/Knuckles-Team.png","language":"Python","readme":"# Repository Manager - A2A | AG-UI | MCP\n\n![PyPI - Version](https://img.shields.io/pypi/v/repository-manager)\n![MCP Server](https://badge.mcpx.dev?type=server 'MCP Server')\n![PyPI - 
Downloads](https://img.shields.io/pypi/dd/repository-manager)\n![GitHub Repo stars](https://img.shields.io/github/stars/Knuckles-Team/repository-manager)\n![GitHub forks](https://img.shields.io/github/forks/Knuckles-Team/repository-manager)\n![GitHub contributors](https://img.shields.io/github/contributors/Knuckles-Team/repository-manager)\n![PyPI - License](https://img.shields.io/pypi/l/repository-manager)\n![GitHub](https://img.shields.io/github/license/Knuckles-Team/repository-manager)\n\n![GitHub last commit (by committer)](https://img.shields.io/github/last-commit/Knuckles-Team/repository-manager)\n![GitHub pull requests](https://img.shields.io/github/issues-pr/Knuckles-Team/repository-manager)\n![GitHub closed pull requests](https://img.shields.io/github/issues-pr-closed/Knuckles-Team/repository-manager)\n![GitHub issues](https://img.shields.io/github/issues/Knuckles-Team/repository-manager)\n\n![GitHub top language](https://img.shields.io/github/languages/top/Knuckles-Team/repository-manager)\n![GitHub language count](https://img.shields.io/github/languages/count/Knuckles-Team/repository-manager)\n![GitHub repo size](https://img.shields.io/github/repo-size/Knuckles-Team/repository-manager)\n![GitHub repo file count (file type)](https://img.shields.io/github/directory-file-count/Knuckles-Team/repository-manager)\n![PyPI - Wheel](https://img.shields.io/pypi/wheel/repository-manager)\n![PyPI - Implementation](https://img.shields.io/pypi/implementation/repository-manager)\n\n*Version: 1.3.32*\n\n## Overview\n\nA Ralph Wiggum-inspired coding agent and repository manager!\n\nThis powerful agent can manage your repositories in bulk, implement new features using the Ralph Wiggum methodology, run git commands, create and edit code across multiple projects, and query your codebase!\n\nRun all supported Git tasks using the Git Actions command.\n\nRun as an MCP Server for Agentic AI with an A2A/AG-UI/Web Server!\n\n## MCP\n\nAI Prompt:\n```text\nClone all the git projects located 
in the file \"/home/genius/Development/repositories-list/repositories.txt\" to my \"/home/genius/Development\" workspace.\nAfterwards, pull all the projects located in the \"/home/genius/Development\" repository workspace.\n```\n\nAI Response:\n```text\nAll projects in \"/home/genius/Development/repositories-list/repositories.txt\" have been cloned to \"/home/genius/Development\"\nand all projects in \"/home/genius/Development\" have been pulled from the repositories. Let me know if you need any further actions! 🚀\n```\n\nThis repository is actively maintained - contributions are welcome!\n\n## A2A Agent\n\n### Architecture\n\n```mermaid\n---\nconfig:\n  layout: dagre\n---\nflowchart TB\n subgraph subGraph0[\"Agent Capabilities\"]\n        C[\"Agent\"]\n        B[\"A2A Server - Uvicorn/FastAPI\"]\n        D[\"MCP Tools\"]\n        F[\"Agent Skills\"]\n  end\n    C --\u003e D \u0026 F\n    A[\"User Query\"] --\u003e B\n    B --\u003e C\n    D --\u003e E[\"Platform API\"]\n\n     C:::agent\n     B:::server\n     A:::server\n    classDef server fill:#f9f,stroke:#333\n    classDef agent fill:#bbf,stroke:#333,stroke-width:2px\n    style B stroke:#000000,fill:#FFD600\n    style D stroke:#000000,fill:#BBDEFB\n    style F fill:#BBDEFB\n    style A fill:#C8E6C9\n    style subGraph0 fill:#FFF9C4\n```\n\n### Component Interaction Diagram\n\n```mermaid\nsequenceDiagram\n    participant User\n    participant Server as A2A Server\n    participant Agent as Agent\n    participant Skill as Agent Skills\n    participant MCP as MCP Tools\n\n    User-\u003e\u003eServer: Send Query\n    Server-\u003e\u003eAgent: Invoke Agent\n    Agent-\u003e\u003eSkill: Analyze Skills Available\n    Skill-\u003e\u003eAgent: Provide Guidance on Next Steps\n    Agent-\u003e\u003eMCP: Invoke Tool\n    MCP--\u003e\u003eAgent: Tool Response Returned\n    Agent--\u003e\u003eAgent: Return Results Summarized\n    Agent--\u003e\u003eServer: Final Response\n    Server--\u003e\u003eUser: Output\n```\n\n## 
Usage\n\n### CLI\n\n| Short Flag | Long Flag        | Description                            |\n|------------|------------------|----------------------------------------|\n| -h         | --help           | See Usage                              |\n| -b         | --default-branch | Checkout default branch                |\n| -c         | --clone          | Clone projects specified               |\n| -w         | --workspace      | Workspace to clone/pull projects       |\n| -f         | --file           | File with repository links             |\n| -p         | --pull           | Pull projects in parent directory      |\n| -r         | --repositories   | Comma separated Git URLs               |\n| -t         | --threads        | Number of parallel threads - Default 4 |\n\n```bash\nrepository-manager \\\n    --clone  \\\n    --pull  \\\n    --workspace '/home/user/Downloads'  \\\n    --file '/home/user/Downloads/repositories.txt'  \\\n    --repositories 'https://github.com/Knucklessg1/media-downloader,https://github.com/Knucklessg1/genius-bot' \\\n    --threads 8\n```\n\n### MCP CLI\n\n| Short Flag | Long Flag                          | Description                                                                 |\n|------------|------------------------------------|-----------------------------------------------------------------------------|\n| -h         | --help                             | Display help information                                                    |\n| -t         | --transport                        | Transport method: 'stdio', 'http', or 'sse' [legacy] (default: stdio)       |\n| -s         | --host                             | Host address for HTTP transport (default: 0.0.0.0)                          |\n| -p         | --port                             | Port number for HTTP transport (default: 8000)                              |\n|            | --auth-type                        | Authentication type: 'none', 'static', 'jwt', 'oauth-proxy', 
'oidc-proxy', 'remote-oauth' (default: none) |\n|            | --token-jwks-uri                   | JWKS URI for JWT verification                                              |\n|            | --token-issuer                     | Issuer for JWT verification                                                |\n|            | --token-audience                   | Audience for JWT verification                                              |\n|            | --oauth-upstream-auth-endpoint     | Upstream authorization endpoint for OAuth Proxy                             |\n|            | --oauth-upstream-token-endpoint    | Upstream token endpoint for OAuth Proxy                                    |\n|            | --oauth-upstream-client-id         | Upstream client ID for OAuth Proxy                                         |\n|            | --oauth-upstream-client-secret     | Upstream client secret for OAuth Proxy                                     |\n|            | --oauth-base-url                   | Base URL for OAuth Proxy                                                   |\n|            | --oidc-config-url                  | OIDC configuration URL                                                     |\n|            | --oidc-client-id                   | OIDC client ID                                                             |\n|            | --oidc-client-secret               | OIDC client secret                                                         |\n|            | --oidc-base-url                    | Base URL for OIDC Proxy                                                    |\n|            | --remote-auth-servers              | Comma-separated list of authorization servers for Remote OAuth             |\n|            | --remote-base-url                  | Base URL for Remote OAuth                                                  |\n|            | --allowed-client-redirect-uris     | Comma-separated list of allowed client redirect URIs                       
|\n|            | --eunomia-type                     | Eunomia authorization type: 'none', 'embedded', 'remote' (default: none)   |\n|            | --eunomia-policy-file              | Policy file for embedded Eunomia (default: mcp_policies.json)              |\n|            | --eunomia-remote-url               | URL for remote Eunomia server                                              |\n\n\n### A2A CLI\n\n| Short Flag | Long Flag         | Description                                                            |\n|------------|-------------------|------------------------------------------------------------------------|\n| -h         | --help            | Display help information                                               |\n|            | --host            | Host to bind the server to (default: 0.0.0.0)                          |\n|            | --port            | Port to bind the server to (default: 9000)                             |\n|            | --reload          | Enable auto-reload                                                     |\n|            | --provider        | LLM Provider: 'openai', 'anthropic', 'google', 'huggingface'           |\n|            | --model-id        | LLM Model ID (default: qwen3:4b)                                       |\n|            | --base-url        | LLM Base URL (for OpenAI compatible providers)                         |\n|            | --api-key         | LLM API Key                                                            |\n|            | --smart-coding-mcp-enable | Enable Smart Coding MCP configuration                                  |\n|            | --python-sandbox-enable | Enable Python Sandbox MCP configuration                                  |\n|            | --workspace            | Workspace to scan for git projects (default: current directory)       |\n\n\n### Smart Coding MCP Integration\n\nThe Repository Manager A2A Agent can automatically configure `smart-coding-mcp` for any Git projects found in 
a specified directory.\n\n```bash\nrepository_manager_a2a --smart-coding-mcp-enable --workspace /path/to/my/projects\n```\n\nThis will:\n1. Scan `/path/to/my/projects` for any subdirectories containing a `.git` folder.\n2. Update `mcp_config.json` to include a `smart-coding-mcp` server entry for each found project.\n3. Start the agent with access to these new MCP servers, allowing for semantic code search within your projects.\n\n### Python Sandbox Integration\n\nThe Agent can execute Python code in a secure Deno sandbox using `mcp-run-python`.\n\n```bash\nrepository_manager_a2a --python-sandbox-enable\n```\n\nThis will:\n1.  Configure `mcp_config.json` to include the `python-sandbox` server.\n2.  Enable the `Python Sandbox` skill, allowing the agent to run scripts for calculation, testing, or logic verification.\n\n### Default Repository List\n\nThe agent will automatically load the `repositories-list.txt` file included in the package as the default project list if no `PROJECTS_FILE` environment variable is set. This ensures the agent always has a list of repositories to work with.\n\n\n### Using as an MCP Server\n\nThe MCP Server can be run in two modes: `stdio` (for local testing) or `http` (for networked access). 
To start the server, use the following commands:\n\n#### Run in stdio mode (default):\n```bash\nrepository-manager-mcp --transport \"stdio\"\n```\n\n#### Run in HTTP mode:\n```bash\nrepository-manager-mcp --transport \"http\" --host \"0.0.0.0\" --port \"8000\"\n```\n\n### Use in Python\n\n```python\nfrom repository_manager.repository_manager import Git\n\ngit = Git()\n\ngit.set_workspace(\"\u003cworkspace\u003e\")\n\ngit.set_threads(threads=8)\n\ngit.set_git_projects(\"\u003cprojects\u003e\")\n\ngit.set_default_branch(set_to_default_branch=True)\n\ngit.clone_projects_in_parallel()\n\ngit.pull_projects_in_parallel()\n```\n\n\n### Deploy MCP Server as a Service\n\nThe Repository Manager MCP server can be deployed using Docker, with configurable authentication, middleware, and Eunomia authorization.\n\n#### Using Docker Run\n\n```bash\ndocker pull knucklessg1/repository-manager:latest\n\ndocker run -d \\\n  --name repository-manager-mcp \\\n  -p 8004:8004 \\\n  -e HOST=0.0.0.0 \\\n  -e PORT=8004 \\\n  -e TRANSPORT=http \\\n  -e AUTH_TYPE=none \\\n  -e EUNOMIA_TYPE=none \\\n  -v development:/root/Development \\\n  knucklessg1/repository-manager:latest\n```\n\nFor advanced authentication (e.g., JWT, OAuth Proxy, OIDC Proxy, Remote OAuth) or Eunomia, add the relevant environment variables:\n\n```bash\ndocker run -d \\\n  --name repository-manager-mcp \\\n  -p 8004:8004 \\\n  -e HOST=0.0.0.0 \\\n  -e PORT=8004 \\\n  -e TRANSPORT=http \\\n  -e AUTH_TYPE=oidc-proxy \\\n  -e OIDC_CONFIG_URL=https://provider.com/.well-known/openid-configuration \\\n  -e OIDC_CLIENT_ID=your-client-id \\\n  -e OIDC_CLIENT_SECRET=your-client-secret \\\n  -e OIDC_BASE_URL=https://your-server.com \\\n  -e ALLOWED_CLIENT_REDIRECT_URIS=http://localhost:*,https://*.example.com/* \\\n  -e EUNOMIA_TYPE=embedded \\\n  -e EUNOMIA_POLICY_FILE=/app/mcp_policies.json \\\n  -v development:/root/Development \\\n  knucklessg1/repository-manager:latest\n```\n\n#### Using Docker Compose\n\nCreate 
a `docker-compose.yml` file:\n\n```yaml\nservices:\n  repository-manager-mcp:\n    image: knucklessg1/repository-manager:latest\n    environment:\n      - HOST=0.0.0.0\n      - PORT=8004\n      - TRANSPORT=http\n      - AUTH_TYPE=none\n      - EUNOMIA_TYPE=none\n    volumes:\n      - development:/root/Development\n    ports:\n      - 8004:8004\n```\n\nFor advanced setups with authentication and Eunomia:\n\n```yaml\nservices:\n  repository-manager-mcp:\n    image: knucklessg1/repository-manager:latest\n    environment:\n      - HOST=0.0.0.0\n      - PORT=8004\n      - TRANSPORT=http\n      - AUTH_TYPE=oidc-proxy\n      - OIDC_CONFIG_URL=https://provider.com/.well-known/openid-configuration\n      - OIDC_CLIENT_ID=your-client-id\n      - OIDC_CLIENT_SECRET=your-client-secret\n      - OIDC_BASE_URL=https://your-server.com\n      - ALLOWED_CLIENT_REDIRECT_URIS=http://localhost:*,https://*.example.com/*\n      - EUNOMIA_TYPE=embedded\n      - EUNOMIA_POLICY_FILE=/app/mcp_policies.json\n    ports:\n      - 8004:8004\n    volumes:\n      - development:/root/Development\n      - ./mcp_policies.json:/app/mcp_policies.json\n```\n\nRun the service:\n\n```bash\ndocker-compose up -d\n```\n\n#### Configure `mcp.json` for AI Integration\n\n```json\n{\n  \"mcpServers\": {\n    \"repository_manager\": {\n      \"command\": \"uv\",\n      \"args\": [\n        \"run\",\n        \"--with\",\n        \"repository-manager\",\n        \"repository-manager-mcp\"\n      ],\n      \"env\": {\n        \"REPOSITORY_MANAGER_WORKSPACE\": \"/home/user/Development/\",                       // Optional - Can be specified at prompt\n        \"REPOSITORY_MANAGER_THREADS\": \"12\",                                              // Optional - Can be specified at prompt\n        \"REPOSITORY_MANAGER_DEFAULT_BRANCH\": \"True\",                                     // Optional - Can be specified at prompt\n        \"REPOSITORY_MANAGER_PROJECTS_FILE\": \"/home/user/Development/repositories.txt\"    // 
Optional - Can be specified at prompt\n      },\n      \"timeout\": 300000\n    }\n  }\n}\n```\n\n### A2A\n\n#### Endpoints\n- **Web UI**: `http://localhost:8000/` (if enabled)\n- **A2A**: `http://localhost:8000/a2a` (Discovery: `/a2a/.well-known/agent.json`)\n- **AG-UI**: `http://localhost:8000/ag-ui` (POST)\n\n#### A2A CLI\n\n| Short Flag | Long Flag         | Description                                                            |\n|------------|-------------------|------------------------------------------------------------------------|\n| -h         | --help            | Display help information                                               |\n|            | --host            | Host to bind the server to (default: 0.0.0.0)                          |\n|            | --port            | Port to bind the server to (default: 9000)                             |\n|            | --reload          | Enable auto-reload                                                     |\n|            | --provider        | LLM Provider: 'openai', 'anthropic', 'google', 'huggingface'           |\n|            | --model-id        | LLM Model ID (default: qwen3:4b)                                       |\n|            | --base-url        | LLM Base URL (for OpenAI compatible providers)                         |\n|            | --api-key         | LLM API Key                                                            |\n|            | --mcp-url         | MCP Server URL (default: http://localhost:8000/mcp)                    |\n|            | --web             | Enable Pydantic AI Web UI (default: False; Env: ENABLE_WEB_UI)         |\n\n\n## Install Python Package\n\n```bash\npython -m pip install --upgrade repository-manager\n```\n\nor\n\n```bash\nuv pip install --upgrade repository-manager\n```\n\n\n## Repository Owners\n\n\u003cimg width=\"100%\" height=\"180em\" 
src=\"https://github-readme-stats.vercel.app/api?username=Knucklessg1\u0026show_icons=true\u0026hide_border=true\u0026count_private=true\u0026include_all_commits=true\" /\u003e\n\n![GitHub followers](https://img.shields.io/github/followers/Knucklessg1)\n![GitHub User's stars](https://img.shields.io/github/stars/Knucklessg1)\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fknuckles-team%2Frepository-manager","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fknuckles-team%2Frepository-manager","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fknuckles-team%2Frepository-manager/lists"}