# FOF-X: Towards Real-time Detailed Human Reconstruction from a Single Image
[website](https://cic.tju.edu.cn/faculty/likun/projects/FOFX/index.html) | [arxiv](https://arxiv.org/pdf/2412.05961)

# FOF: Learning Fourier Occupancy Field for Monocular Real-time Human Reconstruction
[website](https://cic.tju.edu.cn/faculty/likun/projects/FOF/index.html) | [paper](https://cic.tju.edu.cn/faculty/likun/projects/FOF/imgs/FOF_paper.pdf)

## Environment
To run this code, follow the instructions below.
```
conda create -n fof python=3.10
conda activate fof

# Visit https://pytorch.org/ and install the latest PyTorch.
# For example:
# conda install pytorch torchvision torchaudio pytorch-cuda=12.1 -c pytorch -c nvidia

pip install tqdm
pip install opencv-python
pip install open3d
pip install scikit-image
```

You will need the CUDA toolkit matching your PyTorch build to compile the extensions below.
```
pip install Cython
pip install pybind11

# 1. fof
cd lib/fof
sh compile.sh
cd ../..

# 2. mc (Eigen is required to compile this)
cd lib/mc
sh compile.sh
cd ../..
```
Check `train.py` and `test.py`, and have fun with this repo!

## Checkpoints
You can get pretrained checkpoints [here](https://drive.google.com/drive/folders/1ocS0YND9vtdFN8Z99BoUdPu-ktSUwt5x?usp=sharing).
Put them in `./ckpt`, organized as shown below:
```
FOF
├─lib
└─ckpt
    ├─model32.pth
    ├─model48.pth
    ├─netB.pth
    └─netF.pth
```
`model32.pth` and `model48.pth` correspond to the HRNet-32 and HRNet-48 versions, respectively. Both are trained on the THuman2.0 dataset only.

## Data preprocessing
You will need triangle meshes and the corresponding SMPL-X meshes to train the model. Textures are not needed, and the meshes do not have to be watertight (although watertight meshes are preferable). However, the meshes should be clean, without floating fragments. See the preprocessing code `preprocess.py` for details; it performs seven steps. Before running it, you will need to
```
pip install cyminiball
```
Check the code: you can use the function `work` to process your data. If you are using the THuman2.0 dataset, organize your data as shown below.

Your THuman2.0 meshes or your own scanned meshes (only `xxxx.obj` is used):
```
$base_mesh        # line 139 in preprocess.py
├───0000
|   ├─0000.obj
|   ├─material0.jpeg
|   └─material0.mtl
├───0001
├───0002
...
└───0525
```

The corresponding SMPL-X meshes (SMPL also works, but you will need to modify the dataloader yourself; it only takes a few lines):
```
$base_smpl        # line 140 in preprocess.py
├───0000
|   ├─mesh_smplx.obj
|   └─smplx_param.pkl
├───0001
├───0002
...
└───0525
```
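With the two directories laid out as above, the per-subject inputs can be paired up before handing them to `preprocess.py`. The sketch below only builds the (scan, SMPL-X) path pairs; the actual processing call (`work` in `preprocess.py`) is left as a comment because its exact signature here is an assumption.

```python
import os

def list_subjects(base):
    """Return sorted subject IDs such as '0000', '0001', ..."""
    return sorted(d for d in os.listdir(base)
                  if os.path.isdir(os.path.join(base, d)))

def make_jobs(base_mesh, base_smpl):
    """Pair each scan .obj with its SMPL-X mesh, following the layout above."""
    jobs = []
    for name in list_subjects(base_mesh):
        scan = os.path.join(base_mesh, name, name + ".obj")
        smpl = os.path.join(base_smpl, name, "mesh_smplx.obj")
        jobs.append((name, scan, smpl))
    return jobs

# Hypothetical driver; check the real signature of work() in preprocess.py:
# for name, scan, smpl in make_jobs(base_mesh, base_smpl):
#     work(scan, smpl, base_out, name)
```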
The processed data goes into a third directory, which the dataloader also reads from; you don't need to create it yourself. After running `preprocess.py`, it looks like:
```
$base_out         # line 141 in preprocess.py
├───100k-180
|   ├─0000.ply
|   ├─0001.ply
|   ...
|   └─0525.ply
├───smpl-180
|   ├─0000.ply
|   ├─0001.ply
|   ...
|   └─0525.ply
└───train_th2.txt
```
`train_th2.txt` lists the meshes used for training. It should look like:
```
0000
0001
0008
...
```
You can inspect the `.ply` meshes in MeshLab; make sure they are well-aligned.

Finally, change lines 26 and 27 in `lib/dataset_mesh_only.py` to the right paths. Alternatively, you can specify them when instantiating the `TrainSet` class. Now you can enjoy your training!

## Citation

If you find this code useful for your research, please use the following BibTeX entries.

```
@inproceedings{li2022neurips,
  author = {Qiao Feng and Yebin Liu and Yu-Kun Lai and Jingyu Yang and Kun Li},
  title = {FOF: Learning Fourier Occupancy Field for Monocular Real-time Human Reconstruction},
  booktitle = {NeurIPS},
  year = {2022},
}
```
```
@article{fofx,
  author = {Qiao Feng and Yuanwang Yang and Yebin Liu and Yu-Kun Lai and Jingyu Yang and Kun Li},
  title = {FOF-X: Towards Real-time Detailed Human Reconstruction from a Single Image},
  journal = {arXiv preprint arXiv:2412.05961},
  year = {2024},
}
```
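A footnote to the data-preprocessing section above: since `train_th2.txt` is just a plain list of subject IDs, it can be regenerated from the processed output. A minimal sketch, assuming the `$base_out` layout shown earlier (the helper name `write_split` is made up here, not part of the repo):

```python
import os

def write_split(base_out, mesh_dir="100k-180", split_name="train_th2.txt"):
    """Write one subject ID per line for every processed .ply mesh."""
    ids = sorted(os.path.splitext(f)[0]
                 for f in os.listdir(os.path.join(base_out, mesh_dir))
                 if f.endswith(".ply"))
    with open(os.path.join(base_out, split_name), "w") as fh:
        fh.write("\n".join(ids) + "\n")
    return ids
```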