Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/JiarunLiu/Swin-UMamba
Swin-UMamba: Mamba-based UNet with ImageNet-based pretraining
- Host: GitHub
- URL: https://github.com/JiarunLiu/Swin-UMamba
- Owner: JiarunLiu
- License: apache-2.0
- Created: 2024-02-04T16:55:50.000Z (12 months ago)
- Default Branch: main
- Last Pushed: 2024-03-19T15:44:28.000Z (10 months ago)
- Last Synced: 2024-08-01T04:02:10.262Z (6 months ago)
- Language: Python
- Homepage: https://arxiv.org/abs/2402.03302
- Size: 2.26 MB
- Stars: 202
- Watchers: 2
- Forks: 8
- Open Issues: 4
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
- Awesome-Mamba - Swin-UMamba
- Awesome-state-space-models - Vision
README
# Swin-UMamba: Mamba-based UNet with ImageNet-based pretraining
Official repository for: *[Swin-UMamba: Mamba-based UNet with ImageNet-based pretraining](https://arxiv.org/abs/2402.03302)*
![network](https://github.com/JiarunLiu/Swin-UMamba/blob/main/assets/swin-umamba.png)
## Main Results
- AbdomenMRI
- Endoscopy
- Microscopy
## Installation
**Step-1:** Create a new conda environment & install requirements
```shell
conda create -n swin_umamba python=3.10
conda activate swin_umamba
pip install torch==2.0.1 torchvision==0.15.2
pip install causal-conv1d==1.1.1
pip install mamba-ssm
pip install torchinfo timm numba
```

**Step-2:** Install Swin-UMamba
```shell
git clone https://github.com/JiarunLiu/Swin-UMamba
cd Swin-UMamba/swin_umamba
pip install -e .
```

## Prepare data & pretrained model
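After installation, the working tree should look roughly like this (a sketch; the dataset folder names follow the nnU-Net conventions that U-Mamba uses):

```
Swin-UMamba/
├── swin_umamba/              # package installed in Step-2
└── data/
    ├── nnUNet_raw/           # datasets downloaded from U-Mamba
    ├── nnUNet_preprocessed/  # created by nnUNetv2_plan_and_preprocess
    └── pretrained/
        └── vmamba/           # ImageNet-pretrained VMamba checkpoint
```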
**Dataset:**
We use the same data and preprocessing strategy as [U-Mamba](https://github.com/bowang-lab/U-Mamba). Download the datasets from U-Mamba and put them into the `data` folder, then preprocess them with the following command:
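Note that nnU-Net v2 locates datasets through environment variables; if your setup does not already define them, a sketch like the following (the paths are assumptions based on the repo's `data` folder, adjust to your checkout) can be exported before preprocessing:

```shell
# nnU-Net v2 reads its data roots from these variables
# (paths below assume you run from the repository root)
export nnUNet_raw="$PWD/data/nnUNet_raw"
export nnUNet_preprocessed="$PWD/data/nnUNet_preprocessed"
export nnUNet_results="$PWD/data/nnUNet_results"
```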
```shell
nnUNetv2_plan_and_preprocess -d DATASET_ID --verify_dataset_integrity
```

**ImageNet pretrained model:**
We use the ImageNet-pretrained VMamba-Tiny model from [VMamba](https://github.com/MzeroMiko/VMamba). Download the checkpoint and place it at `data/pretrained/vmamba/vmamba_tiny_e292.pth`:
```shell
mkdir -p data/pretrained/vmamba
wget https://github.com/MzeroMiko/VMamba/releases/download/%2320240218/vssmtiny_dp01_ckpt_epoch_292.pth
mv vssmtiny_dp01_ckpt_epoch_292.pth data/pretrained/vmamba/vmamba_tiny_e292.pth
```

## Training
Use the following commands to train and evaluate Swin-UMamba:
```shell
# AbdomenMR dataset
bash scripts/train_AbdomenMR.sh MODEL_NAME
# Endoscopy dataset
bash scripts/train_Endoscopy.sh MODEL_NAME
# Microscopy dataset
bash scripts/train_Microscopy.sh MODEL_NAME
```

Here `MODEL_NAME` can be:
- `nnUNetTrainerSwinUMamba`: Swin-UMamba model with ImageNet pretraining
- `nnUNetTrainerSwinUMambaD`: Swin-UMamba$\dagger$ model with ImageNet pretraining
- `nnUNetTrainerSwinUMambaScratch`: Swin-UMamba model without ImageNet pretraining
- `nnUNetTrainerSwinUMambaDScratch`: Swin-UMamba$\dagger$ model without ImageNet pretraining

You can download our model checkpoints [here](https://drive.google.com/drive/folders/1zOt0ZfQPjoPdY37NfLKevYs4x5eClThN?usp=sharing).
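For example, a full run of the ImageNet-pretrained Swin-UMamba on the AbdomenMR dataset looks like this (assuming the preprocessing above has been done and you run from the repository root):

```shell
# pick a trainer from the list above
MODEL_NAME=nnUNetTrainerSwinUMamba
# train & evaluate on the AbdomenMR dataset
bash scripts/train_AbdomenMR.sh "$MODEL_NAME"
```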
## Acknowledgements
We thank the authors of [nnU-Net](https://github.com/MIC-DKFZ/nnUNet), [Mamba](https://github.com/state-spaces/mamba), [UMamba](https://github.com/bowang-lab/U-Mamba), [VMamba](https://github.com/MzeroMiko/VMamba), and [Swin-Unet](https://github.com/HuCaoFighting/Swin-Unet) for making their valuable code & data publicly available.
## Citation
```bibtex
@article{Swin-UMamba,
title={Swin-UMamba: Mamba-based UNet with ImageNet-based pretraining},
author={Jiarun Liu and Hao Yang and Hong-Yu Zhou and Yan Xi and Lequan Yu and Yizhou Yu and Yong Liang and Guangming Shi and Shaoting Zhang and Hairong Zheng and Shanshan Wang},
journal={arXiv preprint arXiv:2402.03302},
year={2024}
}
```