{"id":47950361,"url":"https://github.com/dimensionalOS/dimos","last_synced_at":"2026-04-19T16:01:04.231Z","repository":{"id":334284778,"uuid":"875378461","full_name":"dimensionalOS/dimos","owner":"dimensionalOS","description":"Dimensional is the agentic operating system for physical space. Vibecode humanoids, quadrupeds, drones, and other hardware platforms in natural language and build multi-agent systems that work seamlessly with physical input (cameras, lidar, actuators).","archived":false,"fork":false,"pushed_at":"2026-04-18T21:47:19.000Z","size":83717,"stargazers_count":3002,"open_issues_count":342,"forks_count":495,"subscribers_count":31,"default_branch":"main","last_synced_at":"2026-04-18T22:27:27.575Z","etag":null,"topics":[],"latest_commit_sha":null,"homepage":"https://dimensionalos.com/","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"other","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/dimensionalOS.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null,"notice":null,"maintainers":null,"copyright":null,"agents":"AGENTS.md","dco":null,"cla":"CLA.md"}},"created_at":"2024-10-19T20:13:16.000Z","updated_at":"2026-04-18T22:06:33.000Z","dependencies_parsed_at":null,"dependency_job_id":null,"html_url":"https://github.com/dimensionalOS/dimos","commit_stats":null,"previous_names":["dimensionalos/dimos"],"tags_count":11,"template":false,"template_full_name":null,"purl":"pkg:github/dimensionalOS/dimos","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/dimensionalOS%2Fdimos","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/d
imensionalOS%2Fdimos/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/dimensionalOS%2Fdimos/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/dimensionalOS%2Fdimos/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/dimensionalOS","download_url":"https://codeload.github.com/dimensionalOS/dimos/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/dimensionalOS%2Fdimos/sbom","scorecard":null,"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":286080680,"owners_count":32012787,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2026-04-18T20:23:30.271Z","status":"online","status_checked_at":"2026-04-19T02:00:07.110Z","response_time":55,"last_error":null,"robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":true,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":[],"created_at":"2026-04-04T09:00:30.367Z","updated_at":"2026-04-19T16:01:04.224Z","avatar_url":"https://github.com/dimensionalOS.png","language":"Python","readme":"\u003cdiv align=\"center\"\u003e\n\n\u003cimg width=\"1000\" alt=\"banner_bordered_trimmed\" src=\"https://github.com/user-attachments/assets/64f13b39-da06-4f58-add0-cfc44f04db4e\" /\u003e\n\n\u003ch2\u003eThe Agentive Operating System for Physical 
Space\u003c/h2\u003e\n\n[![Discord](https://img.shields.io/discord/1341146487186391173?style=flat-square\u0026logo=discord\u0026logoColor=white\u0026label=Discord\u0026color=5865F2)](https://discord.gg/dimos)\n[![Stars](https://img.shields.io/github/stars/dimensionalOS/dimos?style=flat-square)](https://github.com/dimensionalOS/dimos/stargazers)\n[![Forks](https://img.shields.io/github/forks/dimensionalOS/dimos?style=flat-square)](https://github.com/dimensionalOS/dimos/fork)\n[![Contributors](https://img.shields.io/github/contributors/dimensionalOS/dimos?style=flat-square)](https://github.com/dimensionalOS/dimos/graphs/contributors)\n![Nix](https://img.shields.io/badge/Nix-flakes-5277C3?style=flat-square\u0026logo=NixOS\u0026logoColor=white)\n![NixOS](https://img.shields.io/badge/NixOS-supported-5277C3?style=flat-square\u0026logo=NixOS\u0026logoColor=white)\n![CUDA](https://img.shields.io/badge/CUDA-supported-76B900?style=flat-square\u0026logo=nvidia\u0026logoColor=white)\n[![Docker](https://img.shields.io/badge/Docker-ready-2496ED?style=flat-square\u0026logo=docker\u0026logoColor=white)](https://www.docker.com/)\n\n\u003ca href=\"https://trendshift.io/repositories/23169\" target=\"_blank\"\u003e\u003cimg src=\"https://trendshift.io/api/badge/repositories/23169\" alt=\"dimensionalOS%2Fdimos | Trendshift\" style=\"width: 250px; height: 55px;\" width=\"250\" height=\"55\"/\u003e\u003c/a\u003e\n\n\u003cbig\u003e\u003cbig\u003e\n\n[Hardware](#hardware) •\n[Installation](#installation) •\n[Agent CLI \u0026 MCP](#agent-cli-and-mcp) •\n[Blueprints](#blueprints) •\n[Development](#development)\n\n⚠️ **Pre-Release Beta** ⚠️\n\n\u003c/big\u003e\u003c/big\u003e\n\n\u003c/div\u003e\n\n# Intro\n\nDimensional is the modern operating system for generalist robotics. 
We are setting the next-generation SDK standard, integrating with the majority of robot manufacturers.\n\nWith a simple install and no ROS required, build physical applications entirely in Python that run on any humanoid, quadruped, or drone.\n\nDimensional is agent native -- \"vibecode\" your robots in natural language and build (local \u0026 hosted) multi-agent systems that work seamlessly with your hardware. Agents run as native modules — subscribing to any embedded stream, from perception (lidar, camera) and spatial memory down to control loops and motor drivers.\n\u003ctable\u003e\n  \u003ctr\u003e\n    \u003ctd align=\"center\" width=\"50%\"\u003e\n      \u003ca href=\"docs/capabilities/navigation/native/index.md\"\u003e\u003cimg src=\"assets/readme/navigation.gif\" alt=\"Navigation\" width=\"100%\"\u003e\u003c/a\u003e\n    \u003c/td\u003e\n    \u003ctd align=\"center\" width=\"50%\"\u003e\n      \u003cimg src=\"assets/readme/perception.png\" alt=\"Perception\" width=\"100%\"\u003e\n    \u003c/td\u003e\n  \u003c/tr\u003e\n  \u003ctr\u003e\n    \u003ctd align=\"center\" width=\"50%\"\u003e\n      \u003ch3\u003e\u003ca href=\"docs/capabilities/navigation/native/index.md\"\u003eNavigation and Mapping\u003c/a\u003e\u003c/h3\u003e\n      SLAM, dynamic obstacle avoidance, route planning, and autonomous exploration — via both DimOS native and ROS\u003cbr\u003e\u003ca href=\"https://x.com/stash_pomichter/status/2010471593806545367\"\u003eWatch video\u003c/a\u003e\n    \u003c/td\u003e\n    \u003ctd align=\"center\" width=\"50%\"\u003e\n      \u003ch3\u003ePerception\u003c/h3\u003e\n      Detectors, 3D projections, VLMs, audio processing\n    \u003c/td\u003e\n  \u003c/tr\u003e\n  \u003ctr\u003e\n    \u003ctd align=\"center\" width=\"50%\"\u003e\n      \u003ca href=\"docs/capabilities/agents/readme.md\"\u003e\u003cimg src=\"assets/readme/agentic_control.gif\" alt=\"Agents\" width=\"100%\"\u003e\u003c/a\u003e\n    \u003c/td\u003e\n    \u003ctd align=\"center\" 
width=\"50%\"\u003e\n      \u003cimg src=\"assets/readme/spatial_memory.gif\" alt=\"Spatial Memory\" width=\"100%\"\u003e\n    \u003c/td\u003e\n  \u003c/tr\u003e\n  \u003ctr\u003e\n    \u003ctd align=\"center\" width=\"50%\"\u003e\n      \u003ch3\u003e\u003ca href=\"docs/capabilities/agents/readme.md\"\u003eAgentive Control, MCP\u003c/a\u003e\u003c/h3\u003e\n      \"hey Robot, go find the kitchen\"\u003cbr\u003e\u003ca href=\"https://x.com/stash_pomichter/status/2015912688854200322\"\u003eWatch video\u003c/a\u003e\n    \u003c/td\u003e\n    \u003ctd align=\"center\" width=\"50%\"\u003e\n      \u003ch3\u003eSpatial Memory\u003c/h3\u003e\n      Spatio-temporal RAG, dynamic memory, object localization and permanence\u003cbr\u003e\u003ca href=\"https://x.com/stash_pomichter/status/1980741077205414328\"\u003eWatch video\u003c/a\u003e\n    \u003c/td\u003e\n  \u003c/tr\u003e\n\u003c/table\u003e\n\n\n# Hardware\n\n\u003ctable\u003e\n  \u003ctr\u003e\n    \u003ctd align=\"center\" width=\"20%\"\u003e\n      \u003ch3\u003eQuadruped\u003c/h3\u003e\n      \u003cimg width=\"245\" height=\"1\" src=\"assets/readme/spacer.png\"\u003e\n    \u003c/td\u003e\n    \u003ctd align=\"center\" width=\"20%\"\u003e\n      \u003ch3\u003eHumanoid\u003c/h3\u003e\n      \u003cimg width=\"245\" height=\"1\" src=\"assets/readme/spacer.png\"\u003e\n    \u003c/td\u003e\n    \u003ctd align=\"center\" width=\"20%\"\u003e\n      \u003ch3\u003eArm\u003c/h3\u003e\n      \u003cimg width=\"245\" height=\"1\" src=\"assets/readme/spacer.png\"\u003e\n    \u003c/td\u003e\n    \u003ctd align=\"center\" width=\"20%\"\u003e\n      \u003ch3\u003eDrone\u003c/h3\u003e\n      \u003cimg width=\"245\" height=\"1\" src=\"assets/readme/spacer.png\"\u003e\n    \u003c/td\u003e\n    \u003ctd align=\"center\" width=\"20%\"\u003e\n      \u003ch3\u003eMisc\u003c/h3\u003e\n      \u003cimg width=\"245\" height=\"1\" src=\"assets/readme/spacer.png\"\u003e\n    \u003c/td\u003e\n  \u003c/tr\u003e\n\n  \u003ctr\u003e\n  
  \u003ctd align=\"center\" width=\"20%\"\u003e\n      🟩 \u003ca href=\"docs/platforms/quadruped/go2/index.md\"\u003eUnitree Go2 Pro/Air\u003c/a\u003e\u003cbr\u003e\n      🟥 \u003ca href=\"dimos/robot/unitree/b1\"\u003eUnitree B1\u003c/a\u003e\u003cbr\u003e\n    \u003c/td\u003e\n    \u003ctd align=\"center\" width=\"20%\"\u003e\n      🟨 \u003ca href=\"docs/platforms/humanoid/g1/index.md\"\u003eUnitree G1\u003c/a\u003e\u003cbr\u003e\n    \u003c/td\u003e\n    \u003ctd align=\"center\" width=\"20%\"\u003e\n      🟨 \u003ca href=\"docs/capabilities/manipulation/readme.md\"\u003exArm\u003c/a\u003e\u003cbr\u003e\n      🟨 \u003ca href=\"docs/capabilities/manipulation/readme.md\"\u003eAgileX Piper\u003c/a\u003e\u003cbr\u003e\n    \u003c/td\u003e\n    \u003ctd align=\"center\" width=\"20%\"\u003e\n      🟧 \u003ca href=\"dimos/robot/drone/README.md\"\u003eMAVLink\u003c/a\u003e\u003cbr\u003e\n      🟧 \u003ca href=\"dimos/robot/drone/README.md\"\u003eDJI Mavic\u003c/a\u003e\u003cbr\u003e\n    \u003c/td\u003e\n    \u003ctd align=\"center\" width=\"20%\"\u003e\n      🟥 \u003ca href=\"https://github.com/dimensionalOS/openFT-sensor\"\u003eForce Torque Sensor\u003c/a\u003e\u003cbr\u003e\n    \u003c/td\u003e\n  \u003c/tr\u003e\n\u003c/table\u003e\n\u003cbr\u003e\n\u003cdiv align=\"right\"\u003e\n🟩 stable 🟨 beta 🟧 alpha 🟥 experimental\n\n\u003c/div\u003e\n\n\u003e [!IMPORTANT]\n\u003e 🤖 Direct your favorite Agent (OpenClaw, Claude Code, etc.) 
to [AGENTS.md](AGENTS.md) and our [CLI and MCP](#agent-cli-and-mcp) interfaces to start building powerful Dimensional applications.\n\n# Installation\n\n## Interactive Install\n\n```sh\ncurl -fsSL https://raw.githubusercontent.com/dimensionalOS/dimos/main/scripts/install.sh | bash\n```\n\n\u003e See [`scripts/install.sh --help`](scripts/install.sh) for non-interactive and advanced options.\n\n## Manual System Install\n\nTo set up your system dependencies, follow one of these guides:\n\n- 🟩 [Ubuntu 22.04 / 24.04](docs/installation/ubuntu.md)\n- 🟩 [NixOS / General Linux](docs/installation/nix.md)\n- 🟧 [macOS](docs/installation/osx.md)\n\n\u003e Full system requirements, tested configs, and dependency tiers: [docs/requirements.md](docs/requirements.md)\n\n## Python Install\n\n### Quickstart\n\n```bash\nuv venv --python \"3.12\"\nsource .venv/bin/activate\nuv pip install 'dimos[base,unitree]'\n\n# Replay a recorded quadruped session (no hardware needed)\n# NOTE: First run will show a black rerun window while ~75 MB downloads from LFS\ndimos --replay run unitree-go2\n```\n\n```bash\n# Install with simulation support\nuv pip install 'dimos[base,unitree,sim]'\n\n# Run quadruped in MuJoCo simulation\ndimos --simulation run unitree-go2\n\n# Run humanoid in simulation\ndimos --simulation run unitree-g1-sim\n```\n\n```bash\n# Control a real robot (Unitree quadruped over WebRTC)\nexport ROBOT_IP=\u003cYOUR_ROBOT_IP\u003e\ndimos run unitree-go2\n```\n\n# Featured Runfiles\n\n| Run command | What it does |\n|-------------|-------------|\n| `dimos --replay run unitree-go2` | Quadruped navigation replay — SLAM, costmap, A* planning |\n| `dimos --replay --replay-dir unitree_go2_office_walk2 run unitree-go2-temporal-memory` | Quadruped temporal memory replay |\n| `dimos --simulation run unitree-go2-agentic-mcp` | Quadruped agentic + MCP server in simulation |\n| `dimos --simulation run unitree-g1` | Humanoid in MuJoCo simulation |\n| `dimos --replay run drone-basic` | Drone video + 
telemetry replay |\n| `dimos --replay run drone-agentic` | Drone + LLM agent with flight skills (replay) |\n| `dimos run demo-camera` | Webcam demo — no hardware needed |\n| `dimos run keyboard-teleop-xarm7` | Keyboard teleop with mock xArm7 (requires `dimos[manipulation]` extra) |\n| `dimos --simulation run unitree-go2-agentic-ollama` | Quadruped agentic with local LLM (requires [Ollama](https://ollama.com) + `ollama serve`) |\n\n\u003e Full blueprint docs: [docs/usage/blueprints.md](docs/usage/blueprints.md)\n\n# Agent CLI and MCP\n\nThe `dimos` CLI manages the full lifecycle — run blueprints, inspect state, interact with agents, and call skills via MCP.\n\n```bash\ndimos run unitree-go2-agentic-mcp --daemon   # Start in background\ndimos status                              # Check what's running\ndimos log -f                              # Follow logs\ndimos agent-send \"explore the room\"       # Send agent a command\ndimos mcp list-tools                      # List available MCP skills\ndimos mcp call relative_move --arg forward=0.5  # Call a skill directly\ndimos stop                                # Shut down\n```\n\n\u003e Full CLI reference: [docs/usage/cli.md](docs/usage/cli.md)\n\n\n# Usage\n\n## Use DimOS as a Library\n\nBelow is a simple robot connection module that receives a continuous stream of `cmd_vel` commands for the robot and publishes `color_image` frames to a simple `Listener` module. 
DimOS Modules are subsystems on a robot that communicate with other modules using standardized messages.\n\n```py\nimport threading, time, numpy as np\nfrom dimos.core.blueprints import autoconnect\nfrom dimos.core.core import rpc\nfrom dimos.core.module import Module\nfrom dimos.core.stream import In, Out\nfrom dimos.msgs.geometry_msgs import Twist\nfrom dimos.msgs.sensor_msgs import Image, ImageFormat\n\nclass RobotConnection(Module):\n    cmd_vel: In[Twist]\n    color_image: Out[Image]\n\n    @rpc\n    def start(self):\n        threading.Thread(target=self._image_loop, daemon=True).start()\n\n    def _image_loop(self):\n        while True:\n            img = Image.from_numpy(\n                np.zeros((120, 160, 3), np.uint8),\n                format=ImageFormat.RGB,\n                frame_id=\"camera_optical\",\n            )\n            self.color_image.publish(img)\n            time.sleep(0.2)\n\nclass Listener(Module):\n    color_image: In[Image]\n\n    @rpc\n    def start(self):\n        self.color_image.subscribe(lambda img: print(f\"image {img.width}x{img.height}\"))\n\nif __name__ == \"__main__\":\n    autoconnect(\n        RobotConnection.blueprint(),\n        Listener.blueprint(),\n    ).build().loop()\n```\n\n## Blueprints\n\nBlueprints are instructions for how to construct and wire modules. 
We compose them with\n`autoconnect(...)`, which connects streams by `(name, type)` and returns a `Blueprint`.\n\nBlueprints can be composed and remapped, and their transports can be overridden when `autoconnect()` fails due to conflicting stream names or mismatched `In[]` and `Out[]` message types.\n\nThe blueprint below connects the image stream from a robot to an LLM agent for reasoning and action execution:\n```py\nfrom dimos.core.blueprints import autoconnect\nfrom dimos.core.transport import LCMTransport\nfrom dimos.msgs.sensor_msgs import Image\nfrom dimos.robot.unitree.go2.connection import go2_connection\nfrom dimos.agents.agent import agent\n\nblueprint = autoconnect(\n    go2_connection(),\n    agent(),\n).transports({(\"color_image\", Image): LCMTransport(\"/color_image\", Image)})\n\n# Run the blueprint\nif __name__ == \"__main__\":\n    blueprint.build().loop()\n```\n\n## Library API\n\n- [Modules](docs/usage/modules.md)\n- [LCM](docs/usage/lcm.md)\n- [Blueprints](docs/usage/blueprints.md)\n- [Transports](docs/usage/transports/index.md) — LCM, SHM, DDS, ROS 2\n- [Data Streams](docs/usage/data_streams/README.md)\n- [Configuration](docs/usage/configuration.md)\n- [Visualization](docs/usage/visualization.md)\n\n## Demos\n\n\u003cimg src=\"assets/readme/dimos_demo.gif\" alt=\"DimOS Demo\" width=\"100%\"\u003e\n\n# Development\n\n## Develop on DimOS\n\n```sh\nexport GIT_LFS_SKIP_SMUDGE=1\ngit clone -b dev https://github.com/dimensionalOS/dimos.git\ncd dimos\n\nuv sync --all-extras --no-extra dds\n\n# Run fast test suite\nuv run pytest dimos\n```\n\n\n## Multi-Language Support\n\nPython is our glue and prototyping language, but we support many languages via LCM interop.\n\nCheck our language interop examples:\n- [C++](examples/language-interop/cpp/)\n- [Lua](examples/language-interop/lua/)\n- [TypeScript](examples/language-interop/ts/)\n","funding_links":[],"categories":["🌐 Web Development - Frontend","Frameworks \u0026 Libraries"],"sub_categories":["Multi-Agent 
Orchestration"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2FdimensionalOS%2Fdimos","html_url":"https://awesome.ecosyste.ms/projects/github.com%2FdimensionalOS%2Fdimos","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2FdimensionalOS%2Fdimos/lists"}