Code for the Super-Resolution (SR) Benchmark Study involving BLASTNet 2.0 Data. Written by W.T. Chung and B. Akoush, with citations for open-source code in individual files.
To install requirements:
pip install -r requirements.txt
This code works with Python 3.9. We used conda to manage environments in this work.
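For example, the environment can be set up with conda as follows (the environment name blastnet_sr is arbitrary):

conda create -n blastnet_sr python=3.9
conda activate blastnet_sr
pip install -r requirements.txt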
Data for this benchmark can be found on Kaggle.
Brief rundown of the code:
- common contains code for math, data loading, and general utilities.
- metadata contains CSV files with file IDs that can be fed into the dataloaders for 5 different splits (train/test/val + 2 OOD sets).
- create_cubic_files.ipynb precomputes cubic interpolation on files for a baseline comparison.
- find_addport.py finds a free master port for multi-node training (see the sketch after this list).
- sample_lsf_{train,test,testcubic}.sh are batch submission scripts for IBM LSF (not SLURM).
- {train,test,testcubic}.py perform multi-node training, single-node ML evaluation, and single-node cubic-interpolation evaluation, respectively.
- requirements.txt provides recommended packages to run this code.
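For reference, the core idea behind find_addport.py is to ask the OS for a free TCP port that can serve as the master port for distributed training. A minimal sketch of this technique (not the file's exact code):

import socket

def find_free_port() -> int:
    # Bind to port 0 so the OS assigns any free port, then report it.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.bind(("", 0))
        return s.getsockname()[1]

print(find_free_port())  # e.g., export as MASTER_PORT before launching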
To train the models, we provide sample_lsf_train.sh, which performs multi-node training via a batch submission on IBM LSF (not SLURM).
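Under the hood, the script wraps a call to train.py. A hypothetical invocation is sketched below, with flags mirrored from the evaluation command further down; the exact flags used in sample_lsf_train.sh may differ:

python train.py \
--data_path=../diverse_2K_with_extrap/ \
--upscale=8 \
--approx_param=0.5M \
--precision=32 --num_nodes=2 --model_type=rcan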
To evaluate the models, we provide sample_lsf_test.sh, which performs single-GPU evaluation via a batch submission on IBM LSF (not SLURM). Essentially, it runs:
python test.py \
--data_path=../diverse_2K_with_extrap/ \
--upscale=8 --timeit \
--approx_param=0.5M --case_name=./weights/seed42/rcan_approx0.5M_8xSR.pt \
--precision=32 --num_nodes=1 --model_type=rcan
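To submit either batch script to the cluster, use the standard LSF submission command, e.g.:

bsub < sample_lsf_test.sh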
To compare the ML models with cubic interpolation, use create_cubic_files.ipynb to obtain the interpolated data, then use sample_lsf_testcubic.sh, which is equivalent to:
python testcubic.py \
--data_path=../diverse_2K_with_extrap/ \
--cubic_path=../cubic/ \
--upscale=8 --timeit \
--batch_size=2 \
--precision=32 --num_nodes=1
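For intuition, the cubic baseline precomputed by create_cubic_files.ipynb amounts to upsampling each low-resolution field with cubic (order-3) spline interpolation. A minimal sketch of that operation (not the notebook's exact code), using a random placeholder field:

import numpy as np
from scipy.ndimage import zoom

lo_res = np.random.rand(16, 16, 16).astype(np.float32)  # placeholder 3D field
hi_res = zoom(lo_res, zoom=8, order=3)  # 8x upsampling with cubic splines
print(hi_res.shape)  # (128, 128, 128)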
We also provide pre-trained weights in the same Kaggle repo; a sample is included in the weights folder.
You can learn how to plot the features and labels, and perform inference with the pre-trained weights in this Kaggle notebook!
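As a quick local check before opening the notebook, you can peek inside a checkpoint with PyTorch; this minimal sketch assumes (but does not verify) that the .pt file holds a standard state dict:

import torch

ckpt = torch.load("./weights/seed42/rcan_approx0.5M_8xSR.pt", map_location="cpu")
print(type(ckpt))
if isinstance(ckpt, dict):
    print(list(ckpt.keys())[:10])  # peek at the first few entries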