{"id":15480994,"url":"https://github.com/PRBonn/MapMOS","last_synced_at":"2025-10-12T01:31:05.391Z","repository":{"id":185821705,"uuid":"674159023","full_name":"PRBonn/MapMOS","owner":"PRBonn","description":"Building Volumetric Beliefs for Dynamic Environments Exploiting Map-Based Moving Object Segmentation (RAL 2023)","archived":false,"fork":false,"pushed_at":"2025-08-26T08:24:46.000Z","size":129,"stargazers_count":169,"open_issues_count":0,"forks_count":10,"subscribers_count":8,"default_branch":"main","last_synced_at":"2025-09-05T08:53:02.618Z","etag":null,"topics":["cloud","deep-learning","map","minkowski-engine","minkowskiengine","mos","moving","object","point","point-cloud","segmentation","static"],"latest_commit_sha":null,"homepage":"https://www.ipb.uni-bonn.de/pdfs/mersch2023ral.pdf","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/PRBonn.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":"CITATION.cff","codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null,"notice":null,"maintainers":null,"copyright":null,"agents":null,"dco":null,"cla":null}},"created_at":"2023-08-03T09:20:46.000Z","updated_at":"2025-08-26T08:22:16.000Z","dependencies_parsed_at":"2024-04-26T08:31:52.529Z","dependency_job_id":"3c4cc611-292e-4a7a-a0c7-6670b5c4a740","html_url":"https://github.com/PRBonn/MapMOS","commit_stats":null,"previous_names":["prbonn/mapmos"],"tags_count":3,"template":false,"template_full_name":null,"purl":"pkg:github/PRBonn/MapMOS","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/PRBonn%2FMapMOS","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/
repositories/PRBonn%2FMapMOS/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/PRBonn%2FMapMOS/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/PRBonn%2FMapMOS/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/PRBonn","download_url":"https://codeload.github.com/PRBonn/MapMOS/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/PRBonn%2FMapMOS/sbom","scorecard":null,"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":279009769,"owners_count":26084648,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","status":"online","status_checked_at":"2025-10-11T02:00:06.511Z","response_time":55,"last_error":null,"robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":true,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["cloud","deep-learning","map","minkowski-engine","minkowskiengine","mos","moving","object","point","point-cloud","segmentation","static"],"created_at":"2024-10-02T05:00:59.956Z","updated_at":"2025-10-12T01:31:05.373Z","avatar_url":"https://github.com/PRBonn.png","language":"Python","readme":"\u003cdiv align=\"center\"\u003e\n  \u003ch1\u003eBuilding Volumetric Beliefs for Dynamic Environments Exploiting Map-Based Moving Object Segmentation\u003c/h1\u003e\n  \u003ca href=\"https://github.com/PRBonn/MapMOS#how-to-use-it\"\u003e\u003cimg src=\"https://img.shields.io/badge/python-3670A0?style=flat-square\u0026logo=python\u0026logoColor=ffdd54\" 
/\u003e\u003c/a\u003e\n    \u003ca href=\"https://github.com/PRBonn/MapMOS#installation\"\u003e\u003cimg src=\"https://img.shields.io/badge/Linux-FCC624?logo=linux\u0026logoColor=black\" /\u003e\u003c/a\u003e\n    \u003ca href=\"https://www.ipb.uni-bonn.de/pdfs/mersch2023ral.pdf\"\u003e\u003cimg src=\"https://img.shields.io/badge/Paper-pdf-\u003cCOLOR\u003e.svg?style=flat-square\" /\u003e\u003c/a\u003e\n    \u003ca href=\"LICENSE\"\u003e\u003cimg src=\"https://img.shields.io/badge/License-MIT-blue.svg?style=flat-square\" /\u003e\u003c/a\u003e\n\n\u003cp\u003e\n  \u003cimg src=\"https://github.com/PRBonn/MapMOS/assets/38326482/cd594591-8c2c-41f0-8412-cd5d1d2fd7d4\" width=\"500\"/\u003e\n\u003c/p\u003e\n\n\u003cp\u003e\n  \u003ci\u003eOur approach identifies moving objects in the current scan (blue points) and the local map (black points) of the environment and maintains a volumetric belief map representing the dynamic environment.\u003c/i\u003e\n\u003c/p\u003e\n\n\u003cdetails\u003e\n\u003csummary\u003e\u003cb\u003eClick here for qualitative results!\u003c/b\u003e\u003c/summary\u003e\n\n[![MapMOS](https://github.com/PRBonn/MapMOS/assets/38326482/a4238431-bd2d-4b2c-991b-7ff5e9378a8e)](https://github.com/PRBonn/MapMOS/assets/38326482/04c7e5a2-dd44-431a-95b0-c42d5605078a)\n\n \u003ci\u003eOur predictions for the KITTI Tracking sequence 19 with true positives (green), false positives (red), and false negatives (blue).\u003c/i\u003e\n\n\u003c/details\u003e\n\n\n\u003c/div\u003e\n\n\n## Installation\nFirst, make sure the [MinkowskiEngine](https://github.com/NVIDIA/MinkowskiEngine) is installed on your system; see [here](https://github.com/NVIDIA/MinkowskiEngine#installation) for more details.\n\nNext, clone our repository\n```bash\ngit clone git@github.com:PRBonn/MapMOS \u0026\u0026 cd MapMOS\n```\n\nand install with\n```bash\nmake install\n```\n\n**or**\n```bash\nmake install-all\n```\nif you want to install the project with all optional dependencies (needed for the 
visualizer). In case you want to edit the Python code, install in editable mode:\n```bash\nmake editable\n```\n\n## How to Use It\nJust type\n\n```bash\nmapmos_pipeline --help\n```\nto see how to run MapMOS.\n\u003cdetails\u003e\n\u003csummary\u003eThis is what you should see\u003c/summary\u003e\n\n![Screenshot from 2023-08-03 13-07-14](https://github.com/PRBonn/MapMOS/assets/38326482/c769afa6-709d-4648-b42d-11092d5b92ac)\n\n\u003c/details\u003e\n\nCheck the [Downloads](#downloads) section for a pre-trained model. Like [KISS-ICP](https://github.com/PRBonn/kiss-icp), our pipeline runs on a variety of point cloud data formats like `bin`, `pcd`, `ply`, `xyz`, `rosbags`, and more. To visualize these, just type\n\n```bash\nmapmos_pipeline --visualize /path/to/weights.ckpt /path/to/data\n```\n\n\u003cdetails\u003e\n\u003csummary\u003eWant to evaluate with ground truth labels?\u003c/summary\u003e\n\nBecause these labels come in all shapes, you need to specify a dataloader. This is currently available for SemanticKITTI, NuScenes, HeLiMOS, and our labeled KITTI Tracking sequence 19 and Apollo sequences (see [Downloads](#downloads)).\n\n\u003c/details\u003e\n\n\u003cdetails\u003e\n\u003csummary\u003eWant to reproduce the results from the paper?\u003c/summary\u003e\nTo reproduce the results from the paper, you need to pass the corresponding config file. These configs make sure that the de-skewing option and the maximum range are set properly. To compare different map fusion strategies from our paper, just pass the `--paper` flag to `mapmos_pipeline`.\n\n\u003c/details\u003e\n\n\n## Training\nTo train our approach, you first need to cache your data. To see how to do that, just `cd` into the `MapMOS` repository and type\n\n```bash\npython3 scripts/precache.py --help\n```\n\nAfter this, you can run the training script. 
Again, `--help` shows you how:\n```bash\npython3 scripts/train.py --help\n```\n\n\u003cdetails\u003e\n\u003csummary\u003eWant to verify the cached data?\u003c/summary\u003e\n\nYou can inspect the cached training samples with the script `scripts/cache_to_ply.py`; run `python3 scripts/cache_to_ply.py --help` for usage.\n\n\u003c/details\u003e\n\n\u003cdetails\u003e\n\u003csummary\u003eWant to change the logging directory?\u003c/summary\u003e\n\nThe training log and checkpoints will be saved by default to the current working directory. To change that, set the `LOGS` environment variable, e.g. `export LOGS=/your/path/to/logs`, before running the training script.\n\n\u003c/details\u003e\n\n## HeLiMOS\nWe provide additional training and evaluation data for different sensor types in our [HeLiMOS paper](https://www.ipb.uni-bonn.de/pdfs/lim2024iros.pdf). To train on the HeLiMOS data, use the following commands:\n\n```shell\npython3 scripts/precache.py /path/to/HeLiMOS helimos /path/to/cache --config config/helimos/*_training.yaml\npython3 scripts/train.py /path/to/HeLiMOS helimos /path/to/cache --config config/helimos/*_training.yaml\n```\n\nReplace the paths and the config file names accordingly. 
To evaluate, for example, on the Velodyne test data, run\n\n```shell\nmapmos_pipeline /path/to/weights.ckpt /path/to/HeLiMOS --dataloader helimos -s Velodyne/test.txt --config config/helimos/inference.yaml\n```\n\nNote that the sequence argument `-s` encodes both the sensor type (`Velodyne`) and the split (`test.txt`); just replace these with `Ouster`, `Aeva`, or `Avia` and/or `train.txt` or `val.txt` to run MapMOS on different sensors and/or splits.\n\n## Downloads\nYou can download the post-processed and labeled [Apollo dataset](https://www.ipb.uni-bonn.de/html/projects/apollo_dataset/LiDAR-MOS.zip) and [KITTI Tracking sequence 19](https://www.ipb.uni-bonn.de/html/projects/kitti-tracking/post-processed/kitti-tracking.zip) from our website.\n\nThe [weights](https://www.ipb.uni-bonn.de/html/projects/MapMOS/mapmos.ckpt) of our pre-trained model can be downloaded as well.\n\n## Publication\nIf you use our code in your academic work, please cite the corresponding [paper](https://www.ipb.uni-bonn.de/pdfs/mersch2023ral.pdf):\n\n```bibtex\n@article{mersch2023ral,\n  author = {B. Mersch and T. Guadagnino and X. Chen and I. Vizzo and J. Behley and C. Stachniss},\n  title = {{Building Volumetric Beliefs for Dynamic Environments Exploiting Map-Based Moving Object Segmentation}},\n  journal = {IEEE Robotics and Automation Letters (RA-L)},\n  volume = {8},\n  number = {8},\n  pages = {5180--5187},\n  year = {2023},\n  issn = {2377-3766},\n  doi = {10.1109/LRA.2023.3292583},\n  codeurl = {https://github.com/PRBonn/MapMOS},\n}\n```\n\n## Acknowledgments\nThis implementation is heavily inspired by [KISS-ICP](https://github.com/PRBonn/kiss-icp).\n\n## License\nThis project is free software made available under the MIT License. 
For details see the LICENSE file.\n","funding_links":[],"categories":["Python"],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2FPRBonn%2FMapMOS","html_url":"https://awesome.ecosyste.ms/projects/github.com%2FPRBonn%2FMapMOS","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2FPRBonn%2FMapMOS/lists"}