{"id":13790441,"url":"https://github.com/microsoft/MeshGraphormer","last_synced_at":"2025-05-12T09:32:43.665Z","repository":{"id":39833761,"uuid":"391487716","full_name":"microsoft/MeshGraphormer","owner":"microsoft","description":"Research code of ICCV 2021 paper \"Mesh Graphormer\"","archived":false,"fork":false,"pushed_at":"2023-07-06T22:06:34.000Z","size":3563,"stargazers_count":415,"open_issues_count":19,"forks_count":55,"subscribers_count":9,"default_branch":"main","last_synced_at":"2025-05-07T23:47:05.980Z","etag":null,"topics":[],"latest_commit_sha":null,"homepage":"https://arxiv.org/abs/2104.00272","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":null,"status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/microsoft.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":"docs/CODE_OF_CONDUCT.md","threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":"docs/SECURITY.md","support":"docs/SUPPORT.md","governance":null,"roadmap":null,"authors":null}},"created_at":"2021-08-01T00:22:13.000Z","updated_at":"2025-05-07T05:21:35.000Z","dependencies_parsed_at":"2024-01-07T04:49:32.493Z","dependency_job_id":null,"html_url":"https://github.com/microsoft/MeshGraphormer","commit_stats":null,"previous_names":[],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/microsoft%2FMeshGraphormer","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/microsoft%2FMeshGraphormer/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/microsoft%2FMeshGraphormer/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/microsoft%2FMeshGraphormer/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/microsoft","dow
nload_url":"https://codeload.github.com/microsoft/MeshGraphormer/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":253709347,"owners_count":21951123,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":[],"created_at":"2024-08-03T22:00:43.903Z","updated_at":"2025-05-12T09:32:41.116Z","avatar_url":"https://github.com/microsoft.png","language":"Python","readme":"# MeshGraphormer ✨✨\n\n\nThis is our research code of [Mesh Graphormer](https://arxiv.org/abs/2104.00272). \n\nMesh Graphormer is a new transformer-based method for human pose and mesh reconstruction from an input image. In this work, we study how to combine graph convolutions and self-attention in a transformer to better model both local and global interactions. \n\n \u003cimg src=\"docs/graphormer_overview.png\" width=\"650\"\u003e \n\n \u003cimg src=\"docs/Fig1.gif\" width=\"200\"\u003e \u003cimg src=\"docs/Fig2.gif\" width=\"200\"\u003e \u003cimg src=\"docs/Fig3.gif\" width=\"200\"\u003e  \u003cimg src=\"docs/Fig4.gif\" width=\"200\"\u003e \n\n \u003cimg src=\"https://datarelease.blob.core.windows.net/metro/graphormer_demo.gif\" width=\"200\"\u003e\n\n## Installation\nCheck [INSTALL.md](docs/INSTALL.md) for installation instructions.\n\n\n## Model Zoo and Download\nPlease download our pre-trained models and other relevant files that are required to run our code. \n\nCheck [DOWNLOAD.md](docs/DOWNLOAD.md) for details. 
\n\n## Quick demo\nWe provide demo code to run end-to-end inference on the test images.\n\nCheck [DEMO.md](docs/DEMO.md) for details.\n\n## Experiments\nWe provide Python code for training and evaluation.\n\nCheck [EXP.md](docs/EXP.md) for details.\n\n\n## License\n\nOur research code is released under the MIT license. See [LICENSE](LICENSE) for details. \n\nWe use submodules from third parties, such as [huggingface/transformers](https://github.com/huggingface/transformers) and [hassony2/manopth](https://github.com/hassony2/manopth). Please see [NOTICE](NOTICE.md) for details. \n\nOur models depend on the SMPL and MANO models. Please note that any use of the SMPL and MANO models is subject to the **Software Copyright License for non-commercial scientific research purposes**. Please see the [SMPL-Model License](https://smpl.is.tue.mpg.de/modellicense) and [MANO License](https://mano.is.tue.mpg.de/license) for details.\n\n\n## Contributing \n\nWe welcome contributions and suggestions. Please check [CONTRIBUTE](docs/CONTRIBUTE.md) and [CODE_OF_CONDUCT](docs/CODE_OF_CONDUCT.md) for details. \n\n\n## Citations\nIf you find our work useful in your research, please consider citing:\n\n```bibtex\n@inproceedings{lin2021-mesh-graphormer,\nauthor = {Lin, Kevin and Wang, Lijuan and Liu, Zicheng},\ntitle = {Mesh Graphormer},\nbooktitle = {ICCV},\nyear = {2021},\n}\n```\n\n\n## Acknowledgments\n\nOur implementation and experiments are built on top of open-source GitHub repositories. We thank all the authors who made their code public, which tremendously accelerated our project's progress. 
If you find these works helpful, please consider citing them as well.\n\n[huggingface/transformers](https://github.com/huggingface/transformers) \n\n[HRNet/HRNet-Image-Classification](https://github.com/HRNet/HRNet-Image-Classification) \n\n[nkolot/GraphCMR](https://github.com/nkolot/GraphCMR) \n\n[akanazawa/hmr](https://github.com/akanazawa/hmr) \n\n[MandyMo/pytorch_HMR](https://github.com/MandyMo/pytorch_HMR) \n\n[hassony2/manopth](https://github.com/hassony2/manopth) \n\n[hongsukchoi/Pose2Mesh_RELEASE](https://github.com/hongsukchoi/Pose2Mesh_RELEASE) \n\n[mks0601/I2L-MeshNet_RELEASE](https://github.com/mks0601/I2L-MeshNet_RELEASE) \n\n[open-mmlab/mmdetection](https://github.com/open-mmlab/mmdetection) \n","funding_links":[],"categories":["✋ Hand Pose Estimation","Single-Person 3D Mesh Recovery"],"sub_categories":["🔥 State-of-the-Art Methods (2024-2025)","2021"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fmicrosoft%2FMeshGraphormer","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fmicrosoft%2FMeshGraphormer","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fmicrosoft%2FMeshGraphormer/lists"}