This repository is the official PyTorch implementation of our work M2FTrans: Modality-Masked Fusion Transformer for Incomplete Multi-Modality Brain Tumor Segmentation, published in IEEE JBHI, 2024.
All our experiments are implemented with PyTorch and run on two 24 GB NVIDIA GeForce RTX 3090 GPUs. We recommend installing the following package versions:
- python=3.8
- pytorch=1.12.1
- torchvision=0.13.1
Dependency packages can be installed using the following commands:
conda create --name m2ftrans python=3.8
conda activate m2ftrans
pip install -r requirements.txt
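A quick way to verify the installed versions and GPU visibility before moving on (standard PyTorch/torchvision calls only):

```python
# Sanity check for the PyTorch installation and GPU visibility.
import torch
import torchvision

print("torch:", torch.__version__)               # expected 1.12.1
print("torchvision:", torchvision.__version__)   # expected 0.13.1
print("CUDA available:", torch.cuda.is_available())
for i in range(torch.cuda.device_count()):
    print(f"GPU {i}:", torch.cuda.get_device_name(i))
```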
We provide two versions of the training framework, built on the previous works RFNet and SMU-Net, corresponding to M2FTrans_v1 and M2FTrans_v2, respectively.
M2FTrans_v1
- Download the preprocessed dataset (BraTS2020 or BraTS2018) from RFNet and unzip it in the `BraTS` folder:
  tar -xzf BRATS2020_Training_none_npy.tar.gz
  tar -xzf BRATS2018_Training_none_npy.tar.gz
- If you want to preprocess the data yourself, the preprocessing code `preprocess.py` is also provided; see RFNet for more details.
- For BraTS2021, download the training dataset from this link and extract it inside the `BraTS` folder, change the paths `src_path` and `tar_path` in `preprocess_brats2021.py` (see the sketch after this list), then run:
  python preprocess_brats2021.py
- The train.txt, val.txt, and test.txt files for the different datasets should be added to the `BraTS20xx_Training_none_npy` folders; we also provide them in the `BraTS/BraTS20xx_Training_none_npy` folders.
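As a rough illustration of the path change mentioned above, the two variables might be set as follows. Only the names `src_path` and `tar_path` come from the step above; the exact layout of `preprocess_brats2021.py` and the directory names (taken from the folder structure shown below) are assumptions, so check the script itself:

```python
# Hypothetical sketch: point the BraTS2021 preprocessing script at the raw data
# and at the output folder. The variable names come from the README step above;
# the rest of preprocess_brats2021.py may be organized differently.
src_path = "BraTS/BRATS2021_Training_Data"       # raw BraTS2021 training cases
tar_path = "BraTS/BRATS2021_Training_none_npy"   # preprocessed output (seg/, vol/)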
M2FTrans_v2
- Download the BraTS2018 training dataset from this link and extract it inside the `BraTS` folder.
The folder structure is assumed to be:
M2FTrans/
├── BraTS
│ ├── BRATS2018_Training_none_npy
│ │ ├── seg
│ │ ├── vol
│ │ ├── ...
│ ├── BRATS2020_Training_none_npy
│ │ ├── seg
│ │ ├── vol
│ │ ├── ...
│ ├── BRATS2021_Training_none_npy
│ │ ├── seg
│ │ ├── vol
│ │ ├── test.txt
│ │ ├── train.txt
│ │ ├── val.txt
│ ├── BRATS2021_Training_Data
│ │ ├── ...
│ ├── MICCAI_BraTS_2018_Data_Training
│ │ ├── HGG
│ │ ├── LGG
│ │ ├── ...
├── M2FTrans_v1
│ ├── ...
├── M2FTrans_v2
│ ├── ...
└── ...
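Before training, it can help to confirm that the layout above is in place. A minimal sketch (folder and file names taken from the tree above; adjust the list to the datasets you actually use):

```python
# Check that the preprocessed BraTS folders and split files exist as expected.
import os

root = "BraTS"
for dataset in ["BRATS2018_Training_none_npy",
                "BRATS2020_Training_none_npy",
                "BRATS2021_Training_none_npy"]:
    base = os.path.join(root, dataset)
    for sub in ["seg", "vol"]:
        print(base, sub, "->", os.path.isdir(os.path.join(base, sub)))
    for split in ["train.txt", "val.txt", "test.txt"]:
        print(base, split, "->", os.path.isfile(os.path.join(base, split)))
```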
M2FTrans_v1
- Change the paths and hyperparameters in `train.sh`, `train.py`, and `predict.py`.
- Set the data splits for the chosen BraTS20xx dataset in `train.py`.
- Then run:
  bash train.sh
- Note that you may need more training epochs to reach better performance. You can also load a pretrained model (one you trained yourself or one we provide) by setting the resume path in `train.sh`; a checkpoint-inspection sketch follows this list.
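If you want to inspect a checkpoint before pointing the resume path at it, a minimal sketch using standard PyTorch serialization is shown below. The file name `model_last.pth` and the key names in the comment are assumptions, not the repository's fixed format; check how `train.py` saves checkpoints.

```python
# Inspect a saved checkpoint before resuming from it.
# "model_last.pth" and the key names are assumptions; see train.py for the
# actual checkpoint format used by this repository.
import torch

ckpt = torch.load("model_last.pth", map_location="cpu")
print("checkpoint type:", type(ckpt))
if isinstance(ckpt, dict):
    print("top-level keys:", list(ckpt.keys()))  # commonly 'state_dict', 'epoch', 'optimizer'
```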
M2FTrans_v2
- Change the paths and hyperparameters in `config.yml`, `train.py`, and `predict.py`.
- Then run:
  python train.py
For evaluation, check the relevant paths in `eval.sh` or `eval.py`, then run:
- M2FTrans_v1: bash eval.sh
- M2FTrans_v2: python eval.py
- The pretrained models are also available on Google Drive.
The implementation is based on the repos RFNet, mmFormer, and SMU-Net; we'd like to express our gratitude to these open-source works.
Please consider citing this project in your publications if it helps your research. The following is a BibTeX reference:
@ARTICLE{10288381,
author={Shi, Junjie and Yu, Li and Cheng, Qimin and Yang, Xin and Cheng, Kwang-Ting and Yan, Zengqiang},
journal={IEEE Journal of Biomedical and Health Informatics},
title={M2FTrans: Modality-Masked Fusion Transformer for Incomplete Multi-Modality Brain Tumor Segmentation},
year={2024},
volume={28},
number={1},
pages={379-390},
doi={10.1109/JBHI.2023.3326151}}