Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/xypu98/CWSAM
- Host: GitHub
- URL: https://github.com/xypu98/CWSAM
- Owner: xypu98
- Created: 2023-11-08T05:26:31.000Z (about 1 year ago)
- Default Branch: master
- Last Pushed: 2024-01-05T03:04:47.000Z (about 1 year ago)
- Last Synced: 2024-01-05T04:22:50.783Z (about 1 year ago)
- Language: Python
- Size: 208 KB
- Stars: 1
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
Awesome Lists containing this project
- Awesome-Segment-Anything
README
## ClassWise-SAM-Adapter: Parameter Efficient Fine-tuning Adapts Segment Anything to SAR Domain for Semantic Segmentation
Xinyang Pu, Hecheng Jia, Linghao Zheng, Feng Wang, Feng Xu
The Key Laboratory of Information Science of Electromagnetic Waves, Fudan University, Shanghai, China
## Environment
This code was implemented with Python 3.8 and PyTorch 1.13.0. You can install all the requirements via:
```bash
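# Optional sketch (not part of the original README): create an isolated environment
# matching the stated Python 3.8 / PyTorch 1.13.0 setup; the env name "cwsam" is assumed.
conda create -n cwsam python=3.8 -y
conda activate cwsam
pip install torch==1.13.0 torchvision==0.14.0   # pick the build matching your CUDA setup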
pip install -r requirements.txt
```
## Quick Start
1. Prepare the dataset.
2. Download the pre-trained [SAM (Segment Anything)](https://github.com/facebookresearch/segment-anything) checkpoint and put it in ./pretrained (see the download sketch under the Train section below).
3. Training:
```bash
CUDA_VISIBLE_DEVICES=0,1 python -m torch.distributed.launch --nnodes 1 --nproc_per_node 2 train.py --config [CONFIG_PATH]
```
4. Evaluation:
```bash
python test.py --config [CONFIG_PATH] --model [MODEL_PATH]
```
## Train
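The pre-trained SAM backbone (Quick Start step 2) must be available in ./pretrained before training. The configs below reference the ViT-B variant; a minimal download sketch, assuming the official checkpoint filename and that ./pretrained does not exist yet:

```bash
# Fetch the official SAM ViT-B checkpoint (assumed variant, based on the "sam-vit-b" config names)
mkdir -p pretrained
wget -P pretrained https://dl.fbaipublicfiles.com/segment_anything/sam_vit_b_01ec64.pth
```

Training commands (multi-GPU and single-GPU examples via `torch.distributed.launch`):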
```bash
CUDA_VISIBLE_DEVICES=0,1 python -m torch.distributed.launch --nnodes 1 --nproc_per_node 2 train.py --config [CONFIG_PATH]
CUDA_VISIBLE_DEVICES=1 python -m torch.distributed.launch --nnodes 1 --nproc_per_node 1 train.py --master_port='29600' --config [CONFIG_PATH]
CUDA_VISIBLE_DEVICES=0,1 python -m torch.distributed.launch --nnodes 1 --nproc_per_node 2 train.py --config configs_git/fusar-sar-map2-sam-vit-b-10cls-ce-trainval_1024_lr2e4_CE_e200.yaml
CUDA_VISIBLE_DEVICES=0,1 python -m torch.distributed.launch --nnodes 1 --nproc_per_node 2 train.py --config configs_git/fusar-sar-map-sam-vit-b-5cls-ce-trainval_1024_lr2e4_CEv2_e200_ignore_bg.yaml
```
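The commands above use `torch.distributed.launch`, which still works under the pinned PyTorch 1.13.0 (with a deprecation warning). On newer PyTorch releases the equivalent launcher is `torchrun`; a hedged sketch, assuming train.py reads the local rank from the `LOCAL_RANK` environment variable rather than a `--local_rank` argument:

```bash
# torchrun equivalent of the two-GPU command above (assumes env-var based local rank)
CUDA_VISIBLE_DEVICES=0,1 torchrun --nnodes 1 --nproc_per_node 2 train.py --config [CONFIG_PATH]
```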
## Test
```bash
python test.py --config [CONFIG_PATH] --model [MODEL_PATH]
export CUDA_VISIBLE_DEVICES=2
python test.py --config configs/fusar-sar-map-sam-vit-b-5cls-ce-trainval_1024_lr2e4_CEv2_e200_ignore_bg.yaml --model ./save/fusar-sar-map-sam-vit-b-5cls-ce-trainval_1024_lr2e4_CEv2_e200_ignore_bg/model_epoch_best.pth
```
## Citation
If you find our work useful in your research, please consider citing:
```
@misc{pu2024classwisesamadapter,
title={ClassWise-SAM-Adapter: Parameter Efficient Fine-tuning Adapts Segment Anything to SAR Domain for Semantic Segmentation},
author={Xinyang Pu and Hecheng Jia and Linghao Zheng and Feng Wang and Feng Xu},
year={2024},
eprint={2401.02326},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
## Acknowledgements
Part of the code is derived from SAM-Adapter, developed by KOKONI, Moxin Technology (Huzhou) Co., Ltd., Zhejiang University, Singapore University of Technology and Design, Huzhou University, and Beihang University.