SelfMedMAE: Self Pre-training with Masked Autoencoders for Medical Image Analysis

Necessary Packages and Versions

pytorch=1.7.1=py3.8_cuda10.1.243_cudnn7.6.3_0
torchvision=0.8.2=py38_cu101
monai-weekly==0.9.dev2152
nibabel==3.2.1
omegaconf==2.1.1
timm==0.4.12
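
Before running anything, it can help to confirm that the installed packages roughly match the versions above. A minimal check along these lines (not part of the repository) is:

# Hypothetical sanity check: print the installed versions and CUDA availability.
import torch
import torchvision
import monai
import nibabel
import omegaconf
import timm

print("pytorch     :", torch.__version__)        # expected 1.7.1
print("torchvision :", torchvision.__version__)  # expected 0.8.2
print("monai       :", monai.__version__)        # expected 0.9.dev2152 (monai-weekly)
print("nibabel     :", nibabel.__version__)      # expected 3.2.1
print("omegaconf   :", omegaconf.__version__)    # expected 2.1.1
print("timm        :", timm.__version__)         # expected 0.4.12
print("CUDA available:", torch.cuda.is_available())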

Preparation

  1. Install PyTorch, timm and MONAI.
  2. Download the BTCV and MSD_BraTS data.
  3. Install Wandb for logging and visualizations.

Stage 1: MAE Pre-Training

The run scripts are in the scripts directory:

python main.py \
        configs/mae3d_btcv_1gpu.yaml \
        --mask_ratio=0.125 \
        --run_name='mae3d_sincos_vit_base_btcv_mr125'

The default configuration is set in configs/mae3d_btcv_1gpu.yaml. You can override any configuration value by passing a command-line argument with the corresponding key name, e.g., mask_ratio. We use Wandb to monitor the training process and to visualize the masked reconstructions. During training, all outputs, including checkpoints and local Wandb files, are stored under the output_dir specified in the configuration. The core MAE code is in lib/models/mae3d.py.
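
For reference, mask_ratio controls the standard MAE random-masking step: per volume, a fraction of the 3D patch tokens is hidden before the encoder and reconstructed by the decoder. The following is a condensed sketch of that step in the style of the original MAE implementation, not code copied from lib/models/mae3d.py:

import torch

def random_masking(tokens, mask_ratio):
    # tokens: (batch, num_patches, embed_dim) patch embeddings of a 3D volume.
    # Keep a random subset of patches; the rest are reconstructed by the decoder.
    B, N, D = tokens.shape
    len_keep = int(N * (1 - mask_ratio))

    noise = torch.rand(B, N, device=tokens.device)    # per-patch random scores
    ids_shuffle = torch.argsort(noise, dim=1)         # ascending: smallest scores are kept
    ids_restore = torch.argsort(ids_shuffle, dim=1)   # to undo the shuffle later

    ids_keep = ids_shuffle[:, :len_keep]
    visible = torch.gather(tokens, 1, ids_keep.unsqueeze(-1).repeat(1, 1, D))

    # Binary mask over all patches: 0 = kept (visible), 1 = masked (to reconstruct).
    mask = torch.ones(B, N, device=tokens.device)
    mask[:, :len_keep] = 0
    mask = torch.gather(mask, 1, ids_restore)
    return visible, mask, ids_restore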

Stage 2: UNETR Fine-tuning

The run scripts are in the scripts directory:

python main.py \
        configs/unetr_btcv_1gpu.yaml \
        --lr=3.44e-2 \
        --batch_size=6 \
        --run_name=unetr3d_vit_base_btcv_lr3.44e-2_mr125_10ke_pretrain_5000e \
        --pretrain=<path to your pre-trained MAE checkpoint>

The core UNETR code is in lib/models/unetr3d.py.
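
The --pretrain argument points main.py at the Stage 1 checkpoint so the pre-trained ViT encoder weights can be transferred before fine-tuning. The exact checkpoint layout depends on how Stage 1 saved it, but the transfer typically amounts to something like the sketch below; the 'state_dict' key and the "module." prefix handling are assumptions, not verified against this repository:

import torch

def load_mae_encoder(model, ckpt_path):
    # Load a Stage 1 MAE checkpoint and copy matching encoder weights into the
    # fine-tuning model; strict=False skips decoder/mask-token parameters that
    # have no counterpart in the downstream model.
    checkpoint = torch.load(ckpt_path, map_location="cpu")
    state_dict = checkpoint.get("state_dict", checkpoint)  # assumed key name

    # Strip any DistributedDataParallel "module." prefix (assumption).
    state_dict = {k.replace("module.", "", 1): v for k, v in state_dict.items()}

    missing, unexpected = model.load_state_dict(state_dict, strict=False)
    print(f"missing keys: {len(missing)}, unexpected keys: {len(unexpected)}")
    return model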

About

Code for the ISBI 2023 paper "Self Pre-training with Masked Autoencoders for Medical Image Classification and Segmentation".
