{"id":19279515,"url":"https://github.com/nghorbani/moshpp","last_synced_at":"2025-04-22T00:32:49.203Z","repository":{"id":37737282,"uuid":"411734043","full_name":"nghorbani/moshpp","owner":"nghorbani","description":"Motion and Shape Capture from Sparse Markers","archived":false,"fork":false,"pushed_at":"2022-03-23T18:33:27.000Z","size":4285,"stargazers_count":220,"open_issues_count":4,"forks_count":34,"subscribers_count":7,"default_branch":"main","last_synced_at":"2025-04-01T17:24:03.691Z","etag":null,"topics":["computer-vision","marker-based","mocap","solving","vicon"],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"other","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/nghorbani.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null}},"created_at":"2021-09-29T15:42:16.000Z","updated_at":"2025-03-23T08:20:16.000Z","dependencies_parsed_at":"2022-08-10T10:37:57.842Z","dependency_job_id":null,"html_url":"https://github.com/nghorbani/moshpp","commit_stats":null,"previous_names":[],"tags_count":1,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/nghorbani%2Fmoshpp","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/nghorbani%2Fmoshpp/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/nghorbani%2Fmoshpp/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/nghorbani%2Fmoshpp/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/nghorbani","download_url":"https://codeload.github.com/nghorbani/moshpp/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com",
"kind":"github","repositories_count":250158021,"owners_count":21384334,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["computer-vision","marker-based","mocap","solving","vicon"],"created_at":"2024-11-09T21:15:28.193Z","updated_at":"2025-04-22T00:32:48.879Z","avatar_url":"https://github.com/nghorbani.png","language":"Python","readme":"# MoSh++\n\nThis repository contains the official chumpy implementation of mocap body solver used for AMASS:\n\nAMASS: Archive of Motion Capture as Surface Shapes\\\nNaureen Mahmood, Nima Ghorbani, Nikolaus F. Troje, Gerard Pons-Moll, Michael J. Black\\\n[Full paper](http://files.is.tue.mpg.de/black/papers/amass.pdf) | \n[Video](https://www.youtube.com/watch?v=cceRrlnTCEs\u0026ab_channel=MichaelBlack) | \n[Project website](https://amass.is.tue.mpg.de/) | \n[Poster](http://files.is.tue.mpg.de/black/papers/amass_iccv_poster.pdf)\n\n## Description\n\nThis repository holds the code for MoSh++, introduced in [AMASS](http://amass.is.tue.mpg.de/), ICCV'19.\nMoSh++ is the upgraded version of [MoSh](https://ps.is.mpg.de/publications/loper-sigasia-2014), Sig.Asia'2014.\nGiven a *labeled* marker-based motion capture (mocap) c3d file and the *correspondences* \nof the marker labels to the locations on the body, MoSh can\nreturn model parameters for every frame of the mocap sequence. 
\nThe current MoSh++ code works with the following models:\n\n- [SMPL](https://smpl.is.tue.mpg.de/)\n- [SMPL+H](http://mano.is.tue.mpg.de/)\n- [SMPL-X](https://smpl-x.is.tue.mpg.de/)\n- [MANO](http://mano.is.tue.mpg.de/)\n- [Objects](https://grab.is.tue.mpg.de/)\n- [SMAL](https://smal.is.tue.mpg.de/)\n\n## Installation\n\nThis repository requires Python 3.7 and chumpy, a CPU-based auto-differentiation package.\nThis package is intended to be used together with [SOMA](https://github.com/nghorbani/soma), the mocap auto-labeling package.\nPlease install MoSh++ inside the conda environment of SOMA.\nClone the moshpp repository, and run the following from the root directory:\n\n```\nsudo apt install libtbb-dev libeigen3-dev\n\npip install -r requirements.txt\n\ncd src/moshpp/scan2mesh\npip install -r requirements.txt\ncd mesh_distance\nmake\n\ncd ../../../..\npython setup.py install\n```\n\n## Tutorials\nThis repository is a complementary package to [SOMA](https://soma.is.tue.mpg.de/), the automatic mocap labeling tool.\nPlease refer to the [SOMA repository](https://github.com/nghorbani/soma) for tutorials and use cases.\n\n## Citation\n\nPlease cite the following paper if you use this code directly or indirectly in your research/projects:\n\n```\n@inproceedings{AMASS:2019,\n  title={AMASS: Archive of Motion Capture as Surface Shapes},\n  author={Mahmood, Naureen and Ghorbani, Nima and Troje, Nikolaus F. and Pons-Moll, Gerard and Black, Michael J.},\n  booktitle = {The IEEE International Conference on Computer Vision (ICCV)},\n  year={2019},\n  month = {Oct},\n  url = {https://amass.is.tue.mpg.de},\n  month_numeric = {10}\n}\n```\n\nPlease also consider citing the initial version of MoSh from Loper et al., SIGGRAPH Asia'14:\n\n```\n   @article{Loper:SIGASIA:2014,\n     title = {{MoSh}: Motion and Shape Capture from Sparse Markers},\n     author = {Loper, Matthew M. 
and Mahmood, Naureen and Black, Michael J.},\n     address = {New York, NY, USA},\n     publisher = {ACM},\n     month = nov,\n     number = {6},\n     volume = {33},\n     pages = {220:1--220:13},\n     journal = {ACM Transactions on Graphics (Proc. SIGGRAPH Asia)},\n     url = {http://doi.acm.org/10.1145/2661229.2661273},\n     year = {2014},\n     doi = {10.1145/2661229.2661273}\n   }\n```\n\n## License\n\nSoftware Copyright License for **non-commercial scientific research purposes**. Please read carefully\nthe [terms and conditions](./LICENSE) and any accompanying documentation before you download and/or\nuse the MoSh++ data and software (the \"Data \u0026 Software\"), scripts, and animations.\nBy downloading and/or using the Data \u0026 Software (including downloading, cloning, installing, and any other use of this repository),\nyou acknowledge that you have read these terms\nand conditions, understand them, and agree to be bound by them. If you do not agree with these terms and conditions, you\nmust not download and/or use the Data \u0026 Software. 
\nAny infringement of the terms of this agreement will automatically terminate\nyour rights under this [License](./LICENSE).\n\nThe software is compiled using CGAL sources under the license in [CGAL_LICENSE.pdf](CGAL_LICENSE.pdf).\n\n## Contact\n\nThe code in this repository was developed by\n[Nima Ghorbani](https://nghorbani.github.io/),\n[Naureen Mahmood](https://ps.is.tuebingen.mpg.de/person/nmahmood), and\n[Matthew Loper](https://ps.is.mpg.de/~mloper)\nwhile at the [Max Planck Institute for Intelligent Systems, Tübingen, Germany](https://is.mpg.de/).\n\nIf you have any questions, you can contact us at [amass@tuebingen.mpg.de](mailto:amass@tuebingen.mpg.de).\n\nFor commercial licensing, contact [ps-licensing@tue.mpg.de](mailto:ps-licensing@tue.mpg.de).\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fnghorbani%2Fmoshpp","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fnghorbani%2Fmoshpp","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fnghorbani%2Fmoshpp/lists"}