<p align="center">
    <h1 align="center">Partial Weakly-Supervised Oriented Object Detection</h1>
    <p align="center">
    <a href='' style='text-decoration: none'>Mingxin Liu</a>&ensp;
    <a href='https://scholar.google.com/citations?user=rQbW67AAAAAJ' style='text-decoration: none'>Peiyuan Zhang</a>&ensp;
    <a href='' style='text-decoration: none'>Yuan Liu</a>&ensp;
    <a href='' style='text-decoration: none'>Wei Zhang</a>&ensp;
    <a href='https://scholar.google.com/citations?user=v-aQ8GsAAAAJ' style='text-decoration: none'>Yue Zhou</a>&ensp;
    <a href='' style='text-decoration: none'>Ning Liao</a>&ensp;
    <a href='' style='text-decoration: none'>Ziyang Gong</a>&ensp;
    <a href='https://scholar.google.com/citations?user=6XibZaYAAAAJ' style='text-decoration: none'>Junwei Luo</a>&ensp;
    <a href='' style='text-decoration: none'>Zhirui Wang</a>&ensp;
    <a href='https://scholar.google.com/citations?user=OYtSc4AAAAAJ' style='text-decoration: none'>Yi Yu</a>&ensp;
    <a href='https://yangxue.site/' style='text-decoration: none'>Xue Yang</a>
    <div align="center">
      <a href='https://arxiv.org/abs/2507.02751'><img src='https://img.shields.io/badge/arXiv-2507.02751-brown.svg?logo=arxiv&logoColor=white'></a>
      <a href='https://huggingface.co/Xm4nQ8/weight'><img src='https://img.shields.io/badge/HuggingFace-Model-yellow.svg?logo=HuggingFace&logoColor=white'></a>
    </div>
    </p>
</p>

## Introduction
We propose PWOOD, the first Partial Weakly-Supervised Oriented Object Detection framework, which trains on partially weak annotations (horizontal boxes or single points). It efficiently leverages large amounts of unlabeled data, significantly outperforms weakly-supervised detectors trained on the same partial annotations, and offers a lower-cost labeling solution.

<img src="fig/pipline.png" alt="framework" width="100%" />

## Installation
```shell
conda create -n mm python==3.8 -y
conda activate mm

pip install torch==1.13.1+cu116 torchvision==0.14.1+cu116 torchaudio==0.13.1 --extra-index-url https://download.pytorch.org/whl/cu116
pip install -U openmim
mim install mmcv-full
mim install "mmdet<3.0.0"

pip install scikit-learn
pip install prettytable

# For the Point branch
cd mmrotate
pip install -v -e .
```
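After installing, it can be worth confirming that the environment actually matches the version pins above before training. The sketch below is not part of the repository: `version_tuple`, `satisfies_pins`, and the sample version strings are invented here for illustration; feed in the real `torch.__version__` and `mmdet.__version__` from your environment.

```python
# Hypothetical post-install sanity check (not part of PWOOD): compare
# installed version strings against the pins used by the commands above.

def version_tuple(version: str):
    """'1.13.1+cu116' -> (1, 13, 1): keep only the numeric release part."""
    release = version.split("+")[0]
    return tuple(int(part) for part in release.split(".")[:3])

def satisfies_pins(torch_version: str, mmdet_version: str) -> bool:
    """True iff torch matches the 1.13.1 pin and mmdet is below 3.0.0,
    mirroring `mim install "mmdet<3.0.0"` above."""
    return (version_tuple(torch_version) == (1, 13, 1)
            and version_tuple(mmdet_version) < (3, 0, 0))

if __name__ == "__main__":
    # The literals here are illustrative only.
    print(satisfies_pins("1.13.1+cu116", "2.28.2"))
```

Comparing numeric tuples rather than raw strings avoids the usual pitfall where `"1.13.1+cu116"` fails a string comparison against `"1.13.1"` because of the CUDA local-version suffix.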
## Data Preparation
### DOTA
#### 1. Labeled/Unlabeled Data Division
To divide the DOTA-v1.0/v1.5 dataset into labeled and unlabeled data, please refer to [Data preparation of SOOD](https://github.com/HamPerdredes/SOOD).

To divide DOTA-v2.0 into labeled and unlabeled data, please refer to [data_list/dotav2](https://github.com/123sio/PWOOD/tree/HBox/data_list/dotav2).

#### 2. Data Split
For details on how to split the DOTA dataset into patches, please refer to the [official implementation](https://github.com/open-mmlab/mmrotate/blob/main/tools/data/dota/README.md).

After splitting, the data folder should be organized as follows:
```
split_ss_dota_vxx
├── train
│   ├── images
│   └── annfiles
├── val
│   ├── images
│   └── annfiles
├── train_xx_labeled
│   ├── images
│   └── annfiles
└── train_xx_unlabeled
    ├── images
    └── annfiles
```

### DIOR
To divide DIOR into labeled and unlabeled data, please refer to [data_list/dior](https://github.com/123sio/PWOOD/tree/HBox/data_list/dior).

## Train
```bash
# 2 GPUs
CUDA_VISIBLE_DEVICES=0,1 python -m torch.distributed.launch --nnodes=1 \
--node_rank=0 --master_addr="127.0.0.1" --nproc_per_node=2 --master_port=25510 \
train.py configs_dota15/xxx/xxx.py \
--launcher pytorch \
--work-dir work_dir/xxx/
```

## Test
```bash
python test.py configs_dota15/xxx/xxx.py work_dir/xxx/xxx.pth
```

## Weights

### DOTA-v1.0
| Labeled Data | mAP | Config | Model | Log |
| :----------: | :--: | :----: | :---: | :--: |
| 20% | 62.93 | [semi_h2rv2_adamw_dotav1_20p.py](https://github.com/123sio/PWOOD/blob/HBox/configs_dota15/pwood/dotav1/semi_h2rv2_adamw_dotav1_20p.py) | [best_0.629314_mAP.pth](https://huggingface.co/Xm4nQ8/weight/blob/main/work_dir_h/PWOOD/dota1_0/20p/best_0.629314_mAP.pth) | [dotav1_20p_log](https://huggingface.co/Xm4nQ8/weight/blob/main/work_dir_h/PWOOD/dota1_0/20p/20250303_111353.log.json) |
| 30% | 65.42 | [semi_h2rv2_adamw_dotav1_30p.py](https://github.com/123sio/PWOOD/blob/HBox/configs_dota15/pwood/dotav1/semi_h2rv2_adamw_dotav1_30p.py) | [best_0.654153_mAP.pth](https://huggingface.co/Xm4nQ8/weight/blob/main/work_dir_h/PWOOD/dota1_0/30p/best_0.654153_mAP.pth) | [dotav1_30p_log](https://huggingface.co/Xm4nQ8/weight/blob/main/work_dir_h/PWOOD/dota1_0/30p/20250310_193742.log.json) |

### DOTA-v1.5
| Labeled Data | mAP | Config | Model | Log |
| :----------: | :--: | :----: | :---: | :--: |
| 10% | 52.87 | [semi_h2rv2_adamw_dota15_10p.py](https://github.com/123sio/PWOOD/blob/HBox/configs_dota15/pwood/semi_h2rv2_adamw_dota15_10p.py) | [best_0.528748_mAP.pth](https://huggingface.co/Xm4nQ8/weight/blob/main/work_dir_h/PWOOD/gmm/10p_lr/best_0.528748_mAP.pth) | [dotav15_10p_log](https://huggingface.co/Xm4nQ8/weight/blob/main/work_dir_h/PWOOD/gmm/10p_lr/20250219_224359.log.json) |
| 20% | 59.36 | [semi_h2rv2_adamw_dota15_20p.py](https://github.com/123sio/PWOOD/blob/HBox/configs_dota15/pwood/semi_h2rv2_adamw_dota15_20p.py) | [best_0.593614_mAP.pth](https://huggingface.co/Xm4nQ8/weight/blob/main/work_dir_h/PWOOD/gmm/best_0.593614_mAP.pth) | [dotav15_20p_log](https://huggingface.co/Xm4nQ8/weight/blob/main/work_dir_h/PWOOD/gmm/20250217_202030.log.json) |
| 30% | 61.58 | [semi_h2rv2_adamw_dota15_30p.py](https://github.com/123sio/PWOOD/blob/HBox/configs_dota15/pwood/semi_h2rv2_adamw_dota15_30p.py) | [best_0.615836_mAP.pth](https://huggingface.co/Xm4nQ8/weight/blob/main/work_dir_h/PWOOD/gmm/30p_lr/best_0.615836_mAP.pth) | [dotav15_30p_log](https://huggingface.co/Xm4nQ8/weight/blob/main/work_dir_h/PWOOD/gmm/30p_lr/20250219_223950.log.json) |

### DOTA-v2.0
| Labeled Data | mAP | Config | Model | Log |
| :----------: | :--: | :----: | :---: | :--: |
| 10% | 31.30 | [semi_h2rv2_adamw_dota2_10p.py](https://github.com/123sio/PWOOD/blob/HBox/configs_dota15/pwood/dotav2/semi_h2rv2_adamw_dota2_10p.py) | [best_0.310266_mAP.pth](https://huggingface.co/Xm4nQ8/weight/blob/main/work_dir_h/PWOOD/dotav2/10p/best_0.310266_mAP.pth) | [dotav2_10p_log](https://huggingface.co/Xm4nQ8/weight/blob/main/work_dir_h/PWOOD/dotav2/10p/20250313_224047.log.json) |
| 20% | 36.39 | [semi_h2rv2_adamw_dota2_20p.py](https://github.com/123sio/PWOOD/blob/HBox/configs_dota15/pwood/dotav2/semi_h2rv2_adamw_dota2_20p.py) | [best_0.363926_mAP.pth](https://huggingface.co/Xm4nQ8/weight/blob/main/work_dir_h/PWOOD/dotav2/best_0.363926_mAP.pth) | [dotav2_20p_log](https://huggingface.co/Xm4nQ8/weight/blob/main/work_dir_h/PWOOD/dotav2/20250304_174131.log.json) |
| 30% | 40.27 | [semi_h2rv2_adamw_dota2_30p.py](https://github.com/123sio/PWOOD/blob/HBox/configs_dota15/pwood/dotav2/semi_h2rv2_adamw_dota2_30p.py) | [best_0.402659_mAP.pth](https://huggingface.co/Xm4nQ8/weight/blob/main/work_dir_h/PWOOD/dotav2/30p/pro_data/best_0.402659_mAP.pth) | [dotav2_30p_log](https://huggingface.co/Xm4nQ8/weight/blob/main/work_dir_h/PWOOD/dotav2/30p/pro_data/20250321_142715.log.json) |

### DIOR
| Labeled Data | mAP | Config | Model | Log |
| :----------: | :--: | :----: | :---: | :--: |
| 10% | 54.33 | [semi_h2rv2_adamw_dior_10p.py](https://github.com/123sio/PWOOD/blob/HBox/configs_dota15/pwood/dior/semi_h2rv2_adamw_dior_10p.py) | [best_0.543296_mAP.pth](https://huggingface.co/Xm4nQ8/weight/blob/main/work_dir_h/PWOOD/dior/gmm/10p/best_0.543296_mAP.pth) | [dior_10p_log](https://huggingface.co/Xm4nQ8/weight/blob/main/work_dir_h/PWOOD/dior/gmm/10p/20250227_202752.log.json) |
| 20% | 57.89 | [semi_h2rv2_adamw_dior_20p.py](https://github.com/123sio/PWOOD/blob/HBox/configs_dota15/pwood/dior/semi_h2rv2_adamw_dior_20p.py) | [best_0.578923_mAP.pth](https://huggingface.co/Xm4nQ8/weight/blob/main/work_dir_h/PWOOD/dior/gmm/20p/best_0.578923_mAP.pth) | [dior_20p_log](https://huggingface.co/Xm4nQ8/weight/blob/main/work_dir_h/PWOOD/dior/gmm/20p/20250227_205200.log.json) |
| 30% | 60.42 | [semi_h2rv2_adamw_dior_30p.py](https://github.com/123sio/PWOOD/blob/HBox/configs_dota15/pwood/dior/semi_h2rv2_adamw_dior_30p.py) | [best_0.604248_mAP.pth](https://huggingface.co/Xm4nQ8/weight/blob/main/work_dir_h/PWOOD/dior/gmm/30p/best_0.604248_mAP.pth) | [dior_30p_log](https://huggingface.co/Xm4nQ8/weight/blob/main/work_dir_h/PWOOD/dior/gmm/30p/20250301_071406.log.json) |

## Guide
If you need the point-supervised version, please switch to the [Point branch](https://github.com/123sio/PWOOD/tree/Point).

## Citation
```bibtex
@misc{liu2025partialweaklysupervisedorientedobject,
      title={Partial Weakly-Supervised Oriented Object Detection},
      author={Mingxin Liu and Peiyuan Zhang and Yuan Liu and Wei Zhang and Yue Zhou and Ning Liao and Ziyang Gong and Junwei Luo and Zhirui Wang and Yi Yu and Xue Yang},
      year={2025},
      eprint={2507.02751},
      archivePrefix={arXiv},
      primaryClass={cs.CV},
      url={https://arxiv.org/abs/2507.02751}
}
```