{"id":18276251,"url":"https://github.com/hilab-git/abcs_2020","last_synced_at":"2025-07-02T03:04:40.974Z","repository":{"id":108580271,"uuid":"446106393","full_name":"HiLab-git/ABCs_2020","owner":"HiLab-git","description":null,"archived":false,"fork":false,"pushed_at":"2022-01-19T12:56:20.000Z","size":2770,"stargazers_count":2,"open_issues_count":1,"forks_count":1,"subscribers_count":1,"default_branch":"main","last_synced_at":"2025-04-09T04:17:19.865Z","etag":null,"topics":[],"latest_commit_sha":null,"homepage":null,"language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":null,"status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/HiLab-git.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":null,"code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2022-01-09T14:08:47.000Z","updated_at":"2024-08-15T09:42:10.000Z","dependencies_parsed_at":"2023-04-23T15:31:49.622Z","dependency_job_id":null,"html_url":"https://github.com/HiLab-git/ABCs_2020","commit_stats":null,"previous_names":[],"tags_count":0,"template":false,"template_full_name":null,"purl":"pkg:github/HiLab-git/ABCs_2020","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/HiLab-git%2FABCs_2020","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/HiLab-git%2FABCs_2020/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/HiLab-git%2FABCs_2020/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/HiLab-git%2FABCs_2020/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/HiLab-git","download_url":"https://codeload.github.com/HiLab-git/ABCs_2020/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/HiLab-git%2FABCs_2020/sbom","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":263066557,"owners_count":23408387,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":[],"created_at":"2024-11-05T12:15:33.862Z","updated_at":"2025-07-02T03:04:40.947Z","avatar_url":"https://github.com/HiLab-git.png","language":"Python","readme":"# The Second Place of MICCAI 2020 ABCs Challenge\n[nnUNet_link]:https://github.com/MIC-DKFZ/nnUNetdescribe\n[PyMIC_link]:https://github.com/HiLab-git/PyMIC\n[ABCs_link]:https://abcs.mgh.harvard.edu/\nThis repository provides source code for MICCAI 2020 Anatomical brainBarriers to Cancer spread (ABCs) challenge. Method will be briefly introduced below, and our method won the 2nd place of [ABCs](ABCs_link).\nOur method is based on [nnUNet][nnUNet_link], a self-adaptive segmentation method for medical images.\n\n\u003cimg src='./pic/ABCs.png'  width=\"1100\"\u003e\n\n\n# Method overview\nOur solution coatriaons corse and fine stage. We first use a localization model based on [nnUNet][nnUNet_link] to obtain a rough localization of five structures (Falx cerebri, Tentorium cerebelli, Sagittal \u0026 transverse brain sinuses, Cerebellum, Ventricles), and then use a High- and Multi-Resolution Network (HMRNet)  to  segment  each  structure  around  it's  local  region respectively, where a Bidirectional Feature Calibration (BFC) block  is  introduced  for  better  interaction  between  features  in the two branches. \n\u003cimg src='./pic/framework.png'  width=\"1100\"\u003e\n\u003cimg src='./pic/HMRnet.png'  width=\"1100\"\u003e\n\n## Requirements\nThis code depends on [Pytorch](https://pytorch.org) (You need at least version 1.6), [PyMIC][PyMIC_link] and [nnUNet][nnUNet_link]. To use nnUNet, Download [nnUNet][nnUNet_link], and put them in the `ProjectDir` such as `/home/jk/project/ABCs`. \n\n## Environment variable\nInstall the nnUNet and set environmental variables as follows.\n\n```bash\ncd nnUNet\npip install -e .\nexport nnUNet_raw_data_base=\"/home/jk/ABCS_data/nnUNet_raw_data_base\"\nexport nnUNet_preprocessed=\"/home/jk/ABCS_data/nnUNet_preprocessed\"\nexport RESULTS_FOLDER=\"/home/jk/ABCS_data/result\"\n```\n# Coarse stage\n### Data preparation\n* Creat a path_dic for dataset ,  like`[ABCs_data_dir]=\"/home/jk/ABCS_data\" `. Then dowload the dataset from [ABCs](ABCs_link) and put the dataset in the `ABCs_data_dir`, specifically, `ABCs_data_dir/nnUNet_raw_data/Task001_ABCs/imagesTr` for training images, `ABCs_data_dir/nnUNet_raw_data/Task001_ABCs/labelsTr` for training ground truth and `ABCs_data_dir/nnUNet_raw_data/Task001_ABCs/imagesTs` for test images. You can get more detailed guidance [here](https://github.com/MIC-DKFZ/nnUNet/blob/master/documentation/dataset_conversion.md).\n\n* Run the following commands to prepare training and testing data for nnUNet. Note that the input of nnUNet has three channels.`python  ./HNRNet/data_process/creat_data_json.py`\n### Training\nIn the coarse stage, we only need the coarse location. In order to save unnecessary time, you can change `self.max_num_epochs = 1000` to `self.max_num_epochs = 45` in `nnUNet/nnunet/training/network_training/nnUNetTrainerV2.py`.\n* Dataset conversion and preprocess. Run:\n```bash\nnnUNet_plan_and_preprocess -t 001 --verify_dataset_integrity\n```\n* Train 3D UNet. For FOLD in [0, 1, 2, 3, 4], run:\n```bash\nnnUNet_train 3d_fullres nnUNetTrainerV2  Task001_ABCs FOLD --npz\n```\n### Inference\nRun the following command:\n```bash\nnnUNet_predict -i INPUT_FOLDER -o OUTPUT_FOLDER -t 001 -m 3d_fullres -f FOLD -chk model_best \n```\n\n# Fine stage\nFirst, you need to add the HMRNet into the nnUNet framework. \n* Add `HMRNet_p.py` and `HMRNet_s.py` into the `/nnUNet/nnunet/network_architecture/`.\n\n* Replace the `nnunet/training/loss_functions/dice_loss.py` with the file `HMRNet/HMRNet/dice_loss.py`.\n\n* Replace the `nnunet/training/network_training/nnUNetTrainerV2.py` with the file `HMRNet/HMRNet/nnUNetTrainerV2.py`.\n## Data preparation and processing\n* To separate each organ independently, run: \n```bash\npython HMRNet/data_process/Extract_class_and_crop.py\n```\n* Especially, to get the  Sagittal brain sinu and Transverse brain sinus from Sagittal \u0026 transverse brain sinuses (organ3), run:\n```bash\npython HMRNet/data_process/Organ3_to_two_part.py\n```\n\nAfter extracting each organs, treated five types of organs as five tasks like `Task001_ABCs`, for example `Task002_ABCs_organ1,  Task003_ABCs_organ2` etc. And put the corresponding into the bolders.\n## Training\nTo segment each organs, you need trian six models.\nFor Cerebellum, Tentorium cerebelli and Ventricles, please set ` self.network = HMRNet_s(params) `  in  `nnunet/training/network_training/nnUNetTrainerV2.py`.\nFor Falx cerebri, Sagittal brain sinus and Transverse brain sinus, please set `self.network = HMRNet_p(params)` in `nnunet/training/network_training/nnUNetTrainerV2.py`.\nIn order to save unnecessary time, you can change self.max_num_epochs = 1000 to self.max_num_epochs = 400 in `nnUNet/nnunet/training/network_training/nnUNetTrainerV2.py`.\n* Dataset conversion and preprocess. Run:\n```bash\nnnUNet_plan_and_preprocess -t TASK_name --verify_dataset_integrity\n```\n* Train 3D UNet. For FOLD in [0, 1, 2, 3, 4], run:\n```bash\nnnUNet_train 3d_fullres nnUNetTrainerV2  TASK_name FOLD --npz\n```\n`TASK_name` denotes the task_id of six sub-organs.\n\n### Inference\nTo get the result of six model, run the following command respectively:\n```bash\nnnUNet_predict -i INPUT_FOLDER -o OUTPUT_FOLDER -t TASK_name -m 3d_fullres -f FOLD -chk model_best \n```\n### Postprocessing\nTo merge   Sagittal brain sinu and Transverse brain sinus, run:\n```bash\npython HMRNet/data_process/Organ3_two_parts_fusion.py\n```\nTo get the final segmentation result, you should pitch up the organ in succession. Run:\n```bash\npython HMRNet/data_process/Organs_fuse.py\n```\n(Attention:  you need to execute this command five times.)\n\n\n\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fhilab-git%2Fabcs_2020","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fhilab-git%2Fabcs_2020","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fhilab-git%2Fabcs_2020/lists"}