{"id":20149909,"url":"https://github.com/taited/sgdiff","last_synced_at":"2025-04-09T20:11:26.527Z","repository":{"id":188655746,"uuid":"679164395","full_name":"Taited/sgdiff","owner":"Taited","description":"Official implementation of SGDiff (ACM MM '23)","archived":false,"fork":false,"pushed_at":"2023-11-26T17:48:37.000Z","size":32901,"stargazers_count":33,"open_issues_count":3,"forks_count":3,"subscribers_count":5,"default_branch":"main","last_synced_at":"2025-04-09T20:11:19.895Z","etag":null,"topics":["diffusion","fashion","glide","multimedia","sgdiff","style","style-transfer"],"latest_commit_sha":null,"homepage":"https://taited.github.io/sgdiff-project","language":"Jupyter Notebook","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"apache-2.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/Taited.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null}},"created_at":"2023-08-16T08:33:02.000Z","updated_at":"2025-01-28T10:23:54.000Z","dependencies_parsed_at":null,"dependency_job_id":"6ce77c98-c1ca-4e76-b110-e1a9a03369ee","html_url":"https://github.com/Taited/sgdiff","commit_stats":null,"previous_names":["taited/sgdiff"],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Taited%2Fsgdiff","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Taited%2Fsgdiff/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Taited%2Fsgdiff/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Taited%2Fsgdiff/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/Taited","download_url":"https://codeload.github.com/Tai
ted/sgdiff/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":248103872,"owners_count":21048245,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["diffusion","fashion","glide","multimedia","sgdiff","style","style-transfer"],"created_at":"2024-11-13T22:47:21.668Z","updated_at":"2025-04-09T20:11:26.505Z","avatar_url":"https://github.com/Taited.png","language":"Jupyter Notebook","readme":"# Official Implementation of SGDiff (ACM MM '23)\n\n\u003ca href=\"https://taited.github.io/sgdiff-project\" target=\"_blank\"\u003e\n  \u003cimg src=\"https://img.shields.io/badge/Project-Page-Green\"\u003e\n\u003c/a\u003e\n\u003ca href=\"https://arxiv.org/abs/2308.07605\" target=\"_blank\"\u003e\n  \u003cimg src=\"https://img.shields.io/badge/Paper-Arxiv-red\"\u003e\n\u003c/a\u003e\n\nThis is the official implementation of SGDiff: A Style Guided Diffusion Model for Fashion Synthesis (ACM MM '23). SGDiff is built on the MMagic framework (v1.1.0). The training scripts and dataset used in this paper will be released soon.\n\n## Todo List\nTo ensure reproducibility, this project was extensively re-implemented based on MMagic. We anticipate releasing the training code and dataset in late January or early February.\n\n- [ ] Release the training scripts.\n- [ ] Make the dataset publicly available.\n\n## SG-Fashion Dataset Preview\nThe SG-Fashion Dataset contains 17,000 images of fashion products sourced from e-commerce websites such as ASOS, Uniqlo, and H\u0026M. 
We set aside 1,700 of these images as the test set. The dataset covers 72 product categories, encompassing a wide range of garment items.\n![SG-Fashion](/media/SG-Fashion.jpg \"SG-Fashion dataset preview\")\n\n## Installation Guide\n\nTo use SGDiff, you need to install a compatible version of PyTorch with CUDA support. We recommend PyTorch 1.10 with CUDA 11.1. However, our codebase does not depend on this exact combination of PyTorch and CUDA; other versions may also work but have not been extensively tested. Please refer to the [MMagic installation guide](https://github.com/open-mmlab/mmagic#%EF%B8%8F-installation) for more details on setting up your environment.\n\n1. Install a compatible version of PyTorch with CUDA (skip this step if you already have one)\n   ```bash\n   pip install torch==1.10.0+cu111 torchvision==0.11.0+cu111 torchaudio==0.10.0 -f https://download.pytorch.org/whl/torch_stable.html\n   ```\n2. Install the MMagic dependencies\n   ```bash\n   pip3 install openmim\n   mim install 'mmcv\u003e=2.0.0'\n   mim install mmengine\n   ```\n3. 
Install this repository in editable mode\n   ```bash\n   git clone https://github.com/Taited/sgdiff\n   cd sgdiff\n   pip3 install -e .\n   ```\n\n## Inference Code Now Available 🔥\n\nThe inference code for SGDiff is now available in this repository.\n\nBefore running inference, download the model checkpoint from\n[Google Drive](https://drive.google.com/drive/folders/1hnXb9PCmhXc7W05qsK69FSQzFdgIDdo9?usp=sharing).\n\nAfter downloading, you can generate images with the SGDiff model using the following command:\n\n```shell\npython inference.py --ckpt sgdiff.pth --img_path examples/starry_night.jpg --prompt \"long sleeve jumpsuit\"\n```\n| Prompt                  | sleeveless jumpsuit             | long sleeve jumpsuit             | v-neck jumpsuit                  |\n|:-----------------------:|:-------------------------------:|:-------------------------------:|:-------------------------------:|\n|                         | ![sleeveless jumpsuit](/media/sleeveless%20jumpsuit.png) | ![long sleeve jumpsuit](/media/long%20sleeve%20jumpsuit.png) | ![V-Neck jumpsuit](/media/V-Neck%20jumpsuit.png)  |\n\n\n## Citation\n\nIf this repository is helpful to your research, please cite it as follows.\n\n```bibtex\n@inproceedings{10.1145/3581783.3613806,\nauthor = {Sun, Zhengwentai and Zhou, Yanghong and He, Honghong and Mok, P.Y.},\ntitle = {SGDiff: A Style Guided Diffusion Model for Fashion Synthesis},\nyear = {2023},\nisbn = {9798400701085},\npublisher = {Association for Computing Machinery},\naddress = {New York, NY, USA},\nurl = {https://doi.org/10.1145/3581783.3613806},\ndoi = {10.1145/3581783.3613806},\nbooktitle = {Proceedings of the 31st ACM International Conference on Multimedia},\npages = {8433–8442},\nnumpages = {10},\nkeywords = {style guidance, denoising diffusion probabilistic models, text-to-image, fashion synthesis},\nlocation = {Ottawa ON, Canada},\nseries = {MM '23}\n}\n```\n\n## Acknowledgement\n\nThis work builds upon the MMagic library. 
We are grateful to the MMagic team for their substantial contributions to the community. For the exact version of MMagic we used (v1.1.0), please refer to their [repository](https://github.com/open-mmlab/mmagic).\n\nStay tuned for updates on the release of additional resources!\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Ftaited%2Fsgdiff","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Ftaited%2Fsgdiff","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Ftaited%2Fsgdiff/lists"}