{"id":45103060,"url":"https://github.com/OthmaneBlial/lightclaw","last_synced_at":"2026-03-04T17:00:56.377Z","repository":{"id":338908431,"uuid":"1159540400","full_name":"OthmaneBlial/lightclaw","owner":"OthmaneBlial","description":"🦞 LightClaw: The Featherweight Core of OpenClaw — Your AI Agent in a Tiny Codebase — A lightweight OpenClaw alternative","archived":false,"fork":false,"pushed_at":"2026-03-01T15:52:49.000Z","size":5481,"stargazers_count":14,"open_issues_count":0,"forks_count":0,"subscribers_count":0,"default_branch":"main","last_synced_at":"2026-03-01T18:40:05.166Z","etag":null,"topics":["agentic-ai","ai","alternative","chatbot","claude-code","deepseek","lightweight","llm","openai","openclaw","openclaw-plugin","openclaw-skill","openclaw-skills","python","rag","rag-chatbot","xai","z-ai"],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/OthmaneBlial.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null,"notice":null,"maintainers":null,"copyright":null,"agents":null,"dco":null,"cla":null}},"created_at":"2026-02-16T21:13:58.000Z","updated_at":"2026-03-01T15:52:53.000Z","dependencies_parsed_at":null,"dependency_job_id":null,"html_url":"https://github.com/OthmaneBlial/lightclaw","commit_stats":null,"previous_names":["othmaneblial/lightclaw"],"tags_count":0,"template":false,"template_full_name":null,"purl":"pkg:github/OthmaneBlial/lightclaw","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/OthmaneBlial%2Flightclaw","tags_url":"https://repos.ecosyste.
ms/api/v1/hosts/GitHub/repositories/OthmaneBlial%2Flightclaw/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/OthmaneBlial%2Flightclaw/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/OthmaneBlial%2Flightclaw/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/OthmaneBlial","download_url":"https://codeload.github.com/OthmaneBlial/lightclaw/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/OthmaneBlial%2Flightclaw/sbom","scorecard":null,"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":286080680,"owners_count":30086511,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2026-03-04T15:40:14.053Z","status":"ssl_error","status_checked_at":"2026-03-04T15:40:13.655Z","response_time":59,"last_error":"SSL_read: unexpected eof while reading","robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":false,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["agentic-ai","ai","alternative","chatbot","claude-code","deepseek","lightweight","llm","openai","openclaw","openclaw-plugin","openclaw-skill","openclaw-skills","python","rag","rag-chatbot","xai","z-ai"],"created_at":"2026-02-19T21:01:56.199Z","updated_at":"2026-03-04T17:00:56.296Z","avatar_url":"https://github.com/OthmaneBlial.png","language":"Python","funding_links":[],"categories":["Community Projects"],"sub_categories":["Deployment \u0026 Infrastructure"],"readme":"# LightClaw\n\nLightClaw is a **self-hosted Telegram AI agent** 
inspired by OpenClaw: a small Python codebase with long-term memory, multi-provider LLM routing, skills, and local multi-agent delegation.\n\nIf you are searching for an **OpenClaw alternative**, **OpenClaw in Python**, or a **Telegram AI bot with memory**, this project is built for that workflow.\n\n\u003cdiv align=\"center\"\u003e\n  \u003cimg src=\"logo.png\" alt=\"LightClaw logo\" width=\"420\"\u003e\n\u003c/div\u003e\n\n## Security Disclaimer\n\nLightClaw can execute impactful actions (file edits and delegated local agent runs).  \nUse least-privilege credentials, review installed skills, and restrict bot access with `TELEGRAM_ALLOWED_USERS`.\n\n## Why LightClaw\n\n- Lightweight and forkable: understand the core quickly and customize without framework overhead.\n- Practical for solo builders: run on small VPS machines with minimal setup.\n- Built for real usage: memory recall, file operations, skills, and delegated coding agents.\n\n## Core Features\n\n- Long-term memory with SQLite + semantic recall.\n- 6 LLM providers: OpenAI, xAI, Anthropic, Gemini, DeepSeek, Z-AI.\n- Telegram-first experience with command-driven workflow.\n- Local terminal chat mode (`lightclaw chat`) using the same runtime stack.\n- Skills system (hub + local skills).\n- Local agent delegation (`codex`, `claude`) for large coding tasks.\n- Smart multi-agent orchestration with auto-planning, dependencies, and confirmation flow.\n- Workspace-native code generation/editing with compact delta reports.\n- Optional voice transcription with Groq Whisper.\n\n## Quick Start\n\n### 1) One-command setup (recommended)\n\n```bash\ngit clone https://github.com/OthmaneBlial/lightclaw.git \u0026\u0026 cd lightclaw \u0026\u0026 bash setup.sh\n```\n\n`setup.sh` does everything automatically:\n\n- Installs the `lightclaw` command at `~/.local/bin/lightclaw`\n- Writes your config to `~/.env`\n- Creates runtime files in `~/.lightclaw`\n\nThen run:\n\n```bash\nlightclaw run\n```\n\nIf your shell has not 
reloaded `PATH` yet, use:\n\n```bash\n~/.local/bin/lightclaw run\n```\n\n### 2) Manual setup\n\n```bash\ngit clone https://github.com/OthmaneBlial/lightclaw.git\ncd lightclaw\npip install -r requirements.txt\n./lightclaw onboard\n```\n\nThen edit `~/.env` and start:\n\n```bash\n./lightclaw run\n```\n\n## Minimal `.env` Example\n\n```env\n# Provider selection\nLLM_PROVIDER=openai\nLLM_MODEL=latest\n\n# Provider keys (fill what you use)\nOPENAI_API_KEY=\nANTHROPIC_API_KEY=\nANTHROPIC_AUTH_TOKEN=\nANTHROPIC_BASE_URL=\nDEEPSEEK_API_KEY=\n\n# Telegram\nTELEGRAM_BOT_TOKEN=\nTELEGRAM_ALLOWED_USERS=\n\n# Optional generation tuning\nMAX_OUTPUT_TOKENS=12000\n\n# Local delegated agents\nLOCAL_AGENT_TIMEOUT_SEC=1800\nLOCAL_AGENT_PROGRESS_INTERVAL_SEC=30\nLOCAL_AGENT_MULTI_DEFAULT_AGENTS=claude,codex\nLOCAL_AGENT_MULTI_AUTO_CONTINUE=no\nLOCAL_AGENT_SAFETY_MODE=off\nLOCAL_AGENT_DENY_PATTERNS=\n\n# Skills\nSKILLS_HUB_BASE_URL=https://clawhub.ai\nSKILLS_STATE_PATH=.lightclaw/skills_state.json\n```\n\n## CLI Commands\n\n```bash\nlightclaw onboard\nlightclaw onboard --reset-env\nlightclaw onboard --configure\nlightclaw run\nlightclaw run --provider deepseek --model deepseek-chat\nlightclaw chat\n```\n\n## Telegram / Chat Commands\n\n| Command | Purpose |\n|---|---|\n| `/help` | Show command help |\n| `/memory` | Memory stats |\n| `/recall \u003cquery\u003e` | Semantic memory search |\n| `/skills ...` | Search/install/activate skills |\n| `/agent` | Local agent delegation controls |\n| `/agent doctor` | Agent install/auth diagnostics |\n| `/agent multi \u003cgoal\u003e` | Auto-plan multi-agent run |\n| `/agent multi @claude @codex \u003cgoal\u003e` | Prefer specific agents |\n| `/agent multi --agent backend=codex --agent qa=claude \u003cgoal\u003e` | Explicit worker roster |\n| `/agent multi confirm` | Execute pending plan |\n| `/agent multi edit \u003cfeedback\u003e` | Regenerate pending plan |\n| `/agent multi cancel` | Cancel pending plan |\n| `/show` | Current 
runtime/provider/model status |\n| `/clear` | Reset current chat history |\n| `/wipe_memory` | Wipe all saved memory (confirmation required) |\n\n## Smart Multi-Agent Mode\n\n`/agent multi` supports three ways to define worker assignment:\n\n1. Auto mode:\n\n```text\n/agent multi build a full stack todo app\n```\n\n2. Preferred agents (no labels):\n\n```text\n/agent multi @claude @codex build a full stack todo app\n```\n\n3. Explicit roster override (backward compatible):\n\n```text\n/agent multi --agent backend=codex --agent frontend=claude --agent docs=codex build a full stack todo app\n```\n\nHow it runs:\n\n- A plan is generated and shown first.\n- Confirmation is required by default (`confirm`, `yes`) unless `LOCAL_AGENT_MULTI_AUTO_CONTINUE=yes`.\n- `edit` lets you iterate on the plan before execution.\n- `cancel` or `no` clears the pending plan.\n- Execution follows dependency phases so independent workers run in parallel.\n\n## Supported Providers\n\n| Provider | Set `LLM_PROVIDER` | Example Models |\n|---|---|---|\n| OpenAI | `openai` | `gpt-5.2`, `gpt-5.2-mini` |\n| xAI | `xai` | `grok-4-latest` |\n| Claude | `claude` | `claude-opus-4-5`, `claude-sonnet-4-5` |\n| Gemini | `gemini` | `gemini-3-flash-preview`, `gemini-2.5-flash` |\n| DeepSeek | `deepseek` | `deepseek-chat`, `deepseek-reasoner` |\n| Z-AI | `zai` | `glm-5`, `glm-4.7` |\n\nQuick provider check:\n\n```bash\npython scripts/provider_smoke_test.py\n```\n\n## Skills (Hub + Local)\n\nExamples:\n\n```text\n/skills search sonos\n/skills add sonoscli\n/skills use sonoscli\n/skills off sonoscli\n/skills create my_custom_skill \"My private workflow\"\n```\n\nPaths:\n\n- Hub skills: `~/.lightclaw/skills/hub/\u003cslug\u003e/SKILL.md`\n- Local skills: `~/.lightclaw/skills/local/\u003cname\u003e/SKILL.md`\n\n## Architecture (Short)\n\n```text\nTelegram or terminal chat\n  -\u003e memory recall (SQLite + semantic search)\n  -\u003e provider routing (OpenAI/xAI/Claude/Gemini/DeepSeek/Z-AI)\n  -\u003e response + 
optional file operations in ~/.lightclaw/workspace\n  -\u003e optional delegated local agents (single or multi-worker)\n```\n\n## OpenClaw and LightClaw\n\n- OpenClaw: larger TypeScript platform for broad, multi-app orchestration.\n- LightClaw: focused Python core for fast local customization and Telegram-first workflows.\n\nOpenClaw links:\n\n- https://github.com/openclaw/openclaw\n- https://docs.openclaw.ai/\n\n## Requirements\n\n- Python 3.10+\n- Telegram bot token from [@BotFather](https://t.me/BotFather)\n- API credentials for at least one supported LLM provider\n- Optional: Groq API key for voice transcription\n\n## License\n\nMIT\n\n---\n\nLightClaw is intentionally small: easy to read, easy to fork, and fast to ship.\n","project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2FOthmaneBlial%2Flightclaw","html_url":"https://awesome.ecosyste.ms/projects/github.com%2FOthmaneBlial%2Flightclaw","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2FOthmaneBlial%2Flightclaw/lists"}