Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
Modality-Agnostic Learning for Medical Image Segmentation Using Multi-modality Self-distillation
https://github.com/kisonho/magnet
computer-vision machine-learning medical-imaging segmentation self-distillation
Modality-Agnostic Learning for Medical Image Segmentation Using Multi-modality Self-distillation
- Host: GitHub
- URL: https://github.com/kisonho/magnet
- Owner: kisonho
- License: apache-2.0
- Created: 2022-10-06T20:05:53.000Z (about 2 years ago)
- Default Branch: main
- Last Pushed: 2024-07-02T15:28:21.000Z (6 months ago)
- Last Synced: 2024-11-28T11:07:51.221Z (about 1 month ago)
- Topics: computer-vision, machine-learning, medical-imaging, segmentation, self-distillation
- Language: Python
- Homepage:
- Size: 369 KB
- Stars: 4
- Watchers: 2
- Forks: 0
- Open Issues: 0
- Metadata Files:
  - Readme: README.md
  - License: LICENSE
  - Citation: CITATION.cff
Awesome Lists containing this project
README
# Modality-Agnostic Learning for Medical Image Segmentation Using Multi-modality Self-distillation
This is the official implementation of **[MAG-MS](https://arxiv.org/pdf/2306.03730)**. For the official implementation of **[MAGNET: A Modality-Agnostic Networks for Medical Image Segmentation](https://ieeexplore.ieee.org/document/10230587)**, please switch to the branch [stable-1.1](https://github.com/kisonho/magnet/tree/stable-1.1).
MAG-MS is designed to be compatible with MAGNET (v1). The new MAGNET (v2) used in MAG-MS adds support for multi-modality self-distillation and multi-modality feature distillation.
![MAG-MS structure (MAGNET v2)](res/structure_v2.jpg)
## Prerequisites
* Python >= 3.9
* [PyTorch](https://pytorch.org) >= 1.12.1
* [torchmanager](https://github.com/kisonho/torchmanager) >= 1.1
* [Monai](https://monai.io) >= 1.1

## Installation
Use the package manager [pip](https://pip.pypa.io/en/stable/) to install MAG-MS.
```bash
pip install magms
```

## Get Started
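Before diving in, it can help to confirm that the environment matches the prerequisites above. This is a minimal sketch; it only checks the packages that expose a standard version attribute:

```python
import sys

import monai
import torch

# Quick sanity check against the stated prerequisites.
assert sys.version_info >= (3, 9), "Python >= 3.9 is required"
print("PyTorch:", torch.__version__)  # expected >= 1.12.1
print("MONAI:", monai.__version__)    # expected >= 1.1
```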
1. Load datasets
```python
training_dataset = ...
validation_dataset = ...
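# The snippets in steps 1-5 are treated as one cumulative script; the imports they use
# are gathered here for convenience.
from typing import Any, Callable, Sequence, Union

import magnet
import torch
from monai.data import DataLoader, Dataset
from monai.transforms import Compose, EnsureChannelFirstd, LoadImaged

# For example (an illustrative sketch only; file paths and transforms are placeholders,
# not part of the MAG-MS package):
transforms = Compose([LoadImaged(keys=["image", "label"]), EnsureChannelFirstd(keys=["image", "label"])])
train_files = [{"image": "train/img_0.nii.gz", "label": "train/seg_0.nii.gz"}]
val_files = [{"image": "val/img_0.nii.gz", "label": "val/seg_0.nii.gz"}]
training_dataset = DataLoader(Dataset(data=train_files, transform=transforms), batch_size=1)
validation_dataset = DataLoader(Dataset(data=val_files, transform=transforms), batch_size=1)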
```

2. Simply build MAGNET (UNETR backbone) with the `magnet.build` function, or use `magnet.build_v2` (UNETR backbone) / `magnet.build_v2_unet` (3D UNet backbone) for the new MAGNET used in MAG-MS
```python
num_modalities: int = ...
num_classes: int = ...
img_size: Union[int, Sequence[int]] = ...
target_dict = ...  # target mapping passed to `build_v2`; see the repository documentation for its expected format
model = magnet.build_v2(num_modalities, num_classes, img_size, target_dict=target_dict)
```

3. Alternatively, use the lower-level `magnet.nn` framework to customize the MAGNET backbone
```python
encoder1: torch.nn.Module = ...
encoder2: torch.nn.Module = ...
fusion: torch.nn.Module = ...
decoder: torch.nn.Module = ...
model = magnet.nn.MAGNET2(encoder1, encoder2, fusion=fusion, decoder=decoder)
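# Note: the encoders are per-modality feature extractors (e.g. UNETR or 3D U-Net encoders);
# any modules whose outputs are compatible with `fusion` and `decoder` can be plugged in.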
```

4. Define the MAG-MS loss function
```python
main_loss_fn: list[Callable[[Any, Any], torch.Tensor]] = ...
kldiv_loss_fn: list[Callable[[Any, Any], torch.Tensor]] = ...
mse_loss_fn: list[Callable[[Any, Any], torch.Tensor]] = ...
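# For example (illustrative choices only; the KL-divergence and MSE terms follow the
# variable names above, while the Dice+CE main loss is an assumption, not prescribed by MAG-MS):
from monai.losses import DiceCELoss

main_loss_fn = [DiceCELoss(to_onehot_y=True, softmax=True) for _ in range(num_modalities)]
kldiv_loss_fn = [torch.nn.KLDivLoss(reduction="batchmean") for _ in range(num_modalities)]
mse_loss_fn = [torch.nn.MSELoss() for _ in range(num_modalities)]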
self_distillation_loss_fn = magnet.losses.MAGSelfDistillationLoss(main_loss_fn, kldiv_loss_fn)
feature_distillation_loss_fn = magnet.losses.MAGFeatureDistillationLoss(self_distillation_loss_fn, mse_loss_fn)
loss_fn = feature_distillation_loss_fn
```

5. Compile the manager and train/test
```python
optimizer = ...
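# One possible optimizer choice (illustrative only, not prescribed by MAG-MS):
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)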
metric_fns = ...
epochs = ...
callbacks = ...

manager = magnet.Manager(model, optimizer, loss_fn=loss_fn, metric_fns=metric_fns)
manager.fit(training_dataset, epochs, val_dataset=validation_dataset, callbacks=callbacks)
summary = manager.test(validation_dataset)
print(summary)
```

## Monai Support
* Use `magnet.MonaigManager` instead of `magnet.Manager`
* Post-processing is supported via the `post_labels` and `post_predicts` arguments
```python
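# Illustrative post transforms (an assumption, not prescribed by the package;
# MONAI's AsDiscrete is one common choice):
#   post_labels = [monai.transforms.AsDiscrete(to_onehot=num_classes)]
#   post_predicts = [monai.transforms.AsDiscrete(argmax=True, to_onehot=num_classes)]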
post_labels = [...]
post_predicts = [...]

manager = magnet.MonaigManager(model, post_labels=post_labels, post_predicts=post_predicts, optimizer=optimizer, loss_fn=loss_fn, metric_fns=metric_fns)
```

## Cite this work
```bibtex
@article{he2023modality,
    title={Modality-Agnostic Learning for Medical Image Segmentation Using Multi-modality Self-distillation},
    author={He, Qisheng and Summerfield, Nicholas and Dong, Ming and Glide-Hurst, Carri},
    journal={arXiv preprint arXiv:2306.03730},
    year={2023}
}
```