https://github.com/zaccharieramzi/fastmri-reproducible-benchmark
Try several methods for MRI reconstruction on the fastmri dataset. Home to the XPDNet, runner-up of the 2020 fastMRI challenge.
- Host: GitHub
- URL: https://github.com/zaccharieramzi/fastmri-reproducible-benchmark
- Owner: zaccharieramzi
- License: mit
- Created: 2019-06-21T14:36:25.000Z (over 6 years ago)
- Default Branch: master
- Last Pushed: 2024-02-24T12:58:46.000Z (over 1 year ago)
- Last Synced: 2025-03-30T09:32:08.625Z (7 months ago)
- Topics: convolutional-neural-networks, fastmri, fastmri-challenge, fastmri-dataset, mri, mri-reconstruction, neural-network, tensorflow, unrolled-reconstruction-algorithm
- Language: Jupyter Notebook
- Homepage: https://fastmri.org/leaderboards
- Size: 187 MB
- Stars: 157
- Watchers: 2
- Forks: 52
- Open Issues: 20
Metadata Files:
- Readme: README.md
- License: LICENSE
# fastMRI reproducible benchmark
[](https://github.com/zaccharieramzi/fastmri-reproducible-benchmark/actions/workflows/test.yml?query=branch%3Amaster)
[](https://mybinder.org/v2/gh/zaccharieramzi/fastmri-reproducible-benchmark/master)

The idea of this repository is to have a way to rapidly benchmark new solutions against existing reconstruction algorithms on the fastMRI dataset single-coil track.
The reconstruction algorithms implemented or adapted to the fastMRI dataset include, to date:
- zero-filled reconstruction
- [LORAKS](https://www.ncbi.nlm.nih.gov/pubmed/24595341), using the [LORAKS Matlab toolbox](https://mr.usc.edu/download/LORAKS2/)
- Wavelet-based reconstruction (i.e. solving an L1-based analysis formulation optimization problem with greedy FISTA), using [pysap-mri](https://github.com/CEA-COSMIC/pysap-mri)
- U-net
- [DeepCascade net](https://arxiv.org/abs/1704.02422)
- [KIKI net](https://www.ncbi.nlm.nih.gov/pubmed/29624729)
- [Learned Primal Dual](https://arxiv.org/abs/1707.06474), adapted to MRI reconstruction
- [XPDNet](https://arxiv.org/abs/2010.07290), a modular unrolled reconstruction algorithm, in which you can plug your best denoiser.
- [NCPDNet](https://arxiv.org/abs/2101.01570), an unrolled reconstruction algorithm for non-Cartesian acquisitions, with density compensation.

All the neural networks are implemented in TensorFlow with the Keras API.
The older ones (don't judge, this was the beginning of my thesis) are coded using the functional API.
The more recent ones are coded in the subclassed API and are more modular.
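As a point of reference, the zero-filled baseline at the top of the list requires no training at all. Here is a minimal numpy sketch of the idea (an illustration, not the repo's actual implementation, which handles fastMRI's data layout):

```python
import numpy as np

def zero_filled_recon(kspace, mask):
    """Zero-filled baseline: unsampled k-space locations stay at zero,
    and the image is the magnitude of the inverse 2D FFT."""
    return np.abs(np.fft.ifft2(np.fft.ifftshift(kspace * mask)))

# toy example on a square phantom
image = np.zeros((32, 32))
image[8:24, 8:24] = 1.0
kspace = np.fft.fftshift(np.fft.fft2(image))
mask = np.zeros((32, 32))
mask[:, ::4] = 1  # keep every 4th column, a crude "periodic" pattern
recon = zero_filled_recon(kspace, mask)
```

With a fully sampled mask this recovers the original image exactly; with undersampling it shows the aliasing artifacts that the learned methods are trained to remove.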
For the LORAKS reconstruction, you will not be able to reconstruct the proper fastMRI data because of the A/D oversampling.

## Examples
Various examples are available in the `examples` folder.
- The [`mri_reconstruction_intro.ipynb`](examples/mri_reconstruction_intro.ipynb) notebook compares the performance of classical wavelet-based reconstruction (using the [pysap-mri](https://github.com/CEA-COSMIC/pysap-mri) package) and deep learning based approaches like the CascadeNet and the UPDNet on a slice from the fastMRI single coil dataset in a 2D Cartesian setting.
- The [`non_cartesian_reconstruction.ipynb`](examples/non_cartesian_reconstruction.ipynb) notebook showcases the use of the NCPDNet.

These examples can be run in [binder](https://mybinder.org/v2/gh/zaccharieramzi/fastmri-reproducible-benchmark/master).
## Pretrained model checkpoints
Some model checkpoints are available in the [HuggingFace Hub](https://huggingface.co/zaccharieramzi).
You can either download them manually, clone the model repository, or use the `huggingface-hub` package like:
```python
from huggingface_hub import hf_hub_download

REPO_ID_BASE = 'zaccharieramzi/{model_name}'
repo_id = REPO_ID_BASE.format(model_name='... the model name')
model_weights_path = hf_hub_download(
    repo_id=repo_id,
    filename='model_weights.h5',
)
```

Use cases for each model are explained in the respective model cards on the HuggingFace Hub under the "How to use" section.
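As a concrete illustration of the template above (the model name here is a hypothetical placeholder; look up the actual repository names on the HuggingFace Hub):

```python
# mirrors the REPO_ID_BASE template used above; 'XPDNet' is only an
# example name, not necessarily an existing Hub repository
REPO_ID_BASE = 'zaccharieramzi/{model_name}'
repo_id = REPO_ID_BASE.format(model_name='XPDNet')  # hypothetical name
```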
## Reconstruction settings
The main reconstruction settings are Cartesian single-coil and multi-coil 2D reconstruction, with random and "periodic" sampling.
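To make the two Cartesian regimes concrete, here is a numpy sketch of a 1D phase-encoding line-selection mask; the function name and defaults are made up for illustration and are not taken from the repo:

```python
import numpy as np

def cartesian_mask(n_lines, accel=4, n_center=8, periodic=False, seed=0):
    """Choose which phase-encoding lines to keep (hypothetical helper)."""
    mask = np.zeros(n_lines, dtype=bool)
    # always keep a fully sampled low-frequency band in the center
    mask[n_lines // 2 - n_center // 2:n_lines // 2 + n_center // 2] = True
    if periodic:
        mask[::accel] = True  # regularly spaced lines, "periodic" sampling
    else:
        rng = np.random.default_rng(seed)
        n_extra = n_lines // accel - int(mask.sum())
        candidates = np.flatnonzero(~mask)
        picked = rng.choice(candidates, size=max(n_extra, 0), replace=False)
        mask[picked] = True  # random extra lines up to the target rate
    return mask
```

Broadcasting such a 1D mask along the readout direction gives the 2D Cartesian undersampling pattern used in these settings.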
These settings are covered by almost all the networks in this repo, mainly because they are the settings of the fastMRI challenge.

### Other reconstruction settings
__Non-Cartesian__: you can reconstruct non-Cartesian data using the [NCPDNet](https://github.com/zaccharieramzi/fastmri-reproducible-benchmark/blob/master/fastmri_recon/models/subclassed_models/ncpdnet.py).
It relies on the TensorFlow implementation of the NUFFT, [`tfkbnufft`](https://github.com/zaccharieramzi/tfkbnufft).
This network will allow you to work on 2D single-coil and multi-coil data, as well as 3D single-coil data.

__Deep Image Prior__: you can also reconstruct non-Cartesian data in an untrained fashion using the [DIP model](https://github.com/zaccharieramzi/fastmri-reproducible-benchmark/blob/master/fastmri_recon/evaluate/reconstruction/dip_reconstrution.py).
This idea originated from the [Deep Image Prior](https://dmitryulyanov.github.io/deep_image_prior) paper, and was later adapted to MRI reconstruction by different works ([Accelerated MRI with untrained Neural networks](https://arxiv.org/abs/2007.02471), [Time-Dependent Deep Image Prior for Dynamic MRI](https://arxiv.org/abs/1910.01684)).
It is currently only used for 2D non-Cartesian data (primarily for computation time reasons), but you can extend it easily to 2D Cartesian data and 3D (PRs welcome).

## How to train the neural networks
The scripts to train the neural networks are located in `fastmri_recon/training_scripts/`.
You just need to install the package and its dependencies:
```
pip install . &&\
pip install -r requirements.txt
```

## How to write a new neural network for reconstruction
The simplest and most versatile way to write a neural network for reconstruction is to subclass the [`CrossDomainNet` class](fastmri_recon/models/subclassed_models/cross_domain.py).
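To give an idea of what such a cross-domain network alternates between, here is a toy numpy sketch of one unrolled iteration, with a fixed smoothing standing in for the learned image-space correction; this illustrates the idea only and is not the `CrossDomainNet` API:

```python
import numpy as np

def toy_cross_domain_iteration(image, kspace_meas, mask, step=0.5):
    """One unrolled iteration: data consistency in k-space, then an
    image-space correction (fixed smoothing stands in for a CNN)."""
    kspace = np.fft.fft2(image)
    # data consistency: trust the measured samples where available
    kspace = np.where(mask, kspace_meas, kspace)
    image = np.fft.ifft2(kspace)
    smoothed = (np.roll(image, 1, axis=0) + np.roll(image, -1, axis=0)
                + np.roll(image, 1, axis=1) + np.roll(image, -1, axis=1)) / 4
    return (1 - step) * image + step * smoothed

# toy usage: start from the zero-filled image and iterate
mask = np.zeros((32, 32), dtype=bool)
mask[::2, :] = True
kspace_meas = np.fft.fft2(np.random.default_rng(0).random((32, 32))) * mask
image = np.fft.ifft2(kspace_meas)
for _ in range(3):
    image = toy_cross_domain_iteration(image, kspace_meas, mask)
```

In the actual networks, the smoothing step is replaced by a trainable CNN, and the k-space step can also carry learned corrections.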
An example is the [`PDnet`](fastmri_recon/models/subclassed_models/pdnet.py).

## Reproducing the results of the paper
To reproduce the results of the paper for the fastMRI dataset, run the following script (here for the PD contrast):
```
python fastmri_recon/evaluate/scripts/paper_eval.py --contrast CORPD_FBK
```

To reproduce the results of the paper for the OASIS dataset, run the following script (here for fewer samples):
```
python fastmri_recon/evaluate/scripts/paper_eval_oasis.py --n-samples 100
```

Finally, to reproduce the figures of the paper, you will need to use the [`qualitative_validation_for_net`](https://github.com/zaccharieramzi/fastmri-reproducible-benchmark/blob/master/experiments/qualitative_validation_for_net.ipynb) notebook.
### Downloading the model checkpoints
The model checkpoints are stored in the [HuggingFace Hub](https://huggingface.co/zaccharieramzi).
You can download them using the following script, which will automatically put them in the correct directory (for example here the fastMRI models):
```
python fastmri_recon/evaluate/scripts/download_checkpoints.py
```

# Data requirements
## fastMRI
The fastMRI data must be located in a directory whose path is stored in the `FASTMRI_DATA_DIR` environment variable.
It can be downloaded on [the official website](https://fastmri.med.nyu.edu/) after submitting a request (bottom of the page).

The package currently supports the public single-coil and multi-coil knee data.
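For example, in a POSIX shell (the path below is a placeholder for your own download location):

```shell
# tell the benchmark where the fastMRI files live
export FASTMRI_DATA_DIR="$HOME/data/fastmri"
```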
## OASIS
The OASIS data must be located in a directory whose path is stored in the `OASIS_DATA_DIR` environment variable.
It can be downloaded on [the XNAT store](https://central.xnat.org/app/template/Index.vm) after creating an account.
You can for example use the ZIP download available on [this page](https://central.xnat.org/app/action/ProjectDownloadAction/project/OASIS3).
The project is OASIS3.
The whole list of sessions that need to be downloaded is available in [OASIS_list.csv](OASIS_list.csv).

# Citation
This work was presented at the International Symposium on Biomedical Imaging (ISBI) in April 2020.
An extended version has been published in MDPI Applied Sciences.
If you use this package or parts of it, please cite one of the following works:
- [Benchmarking Deep Nets MRI Reconstruction Models on the FastMRI Publicly Available Dataset](https://hal.inria.fr/hal-02436223)
- [Benchmarking MRI Reconstruction Neural Networks on Large Public Datasets](https://www.mdpi.com/2076-3417/10/5/1816)
- [XPDNet for MRI Reconstruction: an Application to the fastMRI 2020 Brain Challenge](https://arxiv.org/abs/2010.07290)
- [Density Compensated Unrolled Networks for Non-Cartesian MRI Reconstruction](https://arxiv.org/abs/2101.01570)

## Applications
This package has been used to perform MRI reconstruction in the following projects (in addition to the ones mentioned above):
- [Is good old GRAPPA dead?](https://arxiv.org/abs/2106.00753)
- [Learning the sampling density in 2D SPARKLING MRI acquisition for optimized image reconstruction](https://arxiv.org/abs/2103.03559)
- [Denoising Score-Matching for Uncertainty Quantification in Inverse Problems](https://arxiv.org/abs/2011.08698)