Pytorch code for "EBSR: Feature Enhanced Burst Super-Resolution with Deformable Alignment", CVPRW 2021, 1st NTIRE (real data track).


EBSR: Feature Enhanced Burst Super-Resolution With Deformable Alignment (CVPRW 2021)

Updates

  • 2022.04.22 🎉🎉🎉 We won 1st place in the NTIRE 2022 BurstSR Challenge again [Paper][Code].
  • 2022.01.22 We updated the code to support real-track testing and provided the model weights here.
  • 2021 We now support single-GPU training and provide the pretrained model here.

This repository is the official PyTorch implementation of the paper "EBSR: Feature Enhanced Burst Super-Resolution With Deformable Alignment" (CVPRW 2021), winner of the NTIRE 2021 Burst Super-Resolution Challenge real track (2nd place in the synthetic track).

Dependencies

  • OS: Ubuntu 18.04
  • Python: 3.7
  • NVIDIA:
    • CUDA: 10.1
    • cuDNN: 7.6.1
  • Other requirements: see requirements.txt
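
Since the pinned Python version matters for the DCNv2 build below, a quick stdlib-only sanity check can confirm the interpreter matches the list above (a minimal sketch; the version tuple comes from the dependency list):

```python
import sys

def meets_min_python(min_version=(3, 7)):
    """Check the running interpreter against a minimum (major, minor) version."""
    return sys.version_info[:2] >= min_version

if __name__ == "__main__":
    print("Python version OK:", meets_min_python())
```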

Quick Start

1. Create a conda virtual environment and activate it:

```shell
conda create -n pytorch_1.6 python=3.7
source activate pytorch_1.6
```

2. Install PyTorch and torchvision following the official instructions:

```shell
conda install pytorch==1.6.0 torchvision==0.7.0 cudatoolkit=10.1 -c pytorch
```

3. Install build requirements:

```shell
pip3 install -r requirements.txt
```

4. (Optional) Install apex to use DistributedDataParallel, following the NVIDIA apex instructions:

```shell
git clone https://github.com/NVIDIA/apex
cd apex
pip install -v --disable-pip-version-check --no-cache-dir --global-option="--cpp_ext" --global-option="--cuda_ext" ./
```

5. Install DCN:

```shell
cd DCNv2-pytorch_1.6
python3 setup.py build develop  # build the extension
python3 test.py                 # run the examples and check the build
```

Training

```shell
# Modify the root paths of the training dataset, model, etc.
# Training requires more than one GPU.
python main.py --n_GPUs 4 --lr 0.0002 --decay 200-400 --save ebsr --model EBSR --fp16 --lrcn --non_local --n_feats 128 --n_resblocks 8 --n_resgroups 5 --batch_size 16 --burst_size 14 --patch_size 256 --scale 4 --loss 1*L1
```
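
The `--decay 200-400` flag reads like the EDSR-style convention of a step decay at the listed epochs; a minimal sketch of that schedule is below. The decay factor `gamma=0.5` is an assumption here, so check the repo's optimizer setup for the actual value:

```python
def lr_at_epoch(epoch, base_lr=2e-4, milestones=(200, 400), gamma=0.5):
    """Step-decay schedule: multiply the LR by gamma at each milestone epoch passed."""
    passed = sum(epoch >= m for m in milestones)
    return base_lr * gamma ** passed

# lr_at_epoch(0)   -> 2e-4  (base LR from the command above)
# lr_at_epoch(200) -> 1e-4  (first milestone reached)
# lr_at_epoch(400) -> 5e-5  (second milestone reached)
```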

Test

```shell
# Modify the path of the test dataset and the path of the trained model
python test.py --root /data/dataset/ntire21/burstsr/synthetic/syn_burst_val --model EBSR --lrcn --non_local --n_feats 128 --n_resblocks 8 --n_resgroups 5 --burst_size 14 --scale 4 --pre_train ./checkpoints/EBSRbest_epoch.pth
```

or test on the validation dataset:

```shell
python main.py --n_GPUs 1 --test_only --model EBSR --lrcn --non_local --n_feats 128 --n_resblocks 8 --n_resgroups 5 --burst_size 14 --scale 4 --pre_train ./checkpoints/EBSRbest_epoch.pth
```
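
Burst SR challenge results are typically scored with PSNR; the metric can be sketched as follows (a stdlib-only illustration over flat pixel sequences, not the repo's actual evaluation code, which operates on image tensors):

```python
import math

def psnr(pred, target, max_val=1.0):
    """Peak signal-to-noise ratio between two equally sized pixel sequences."""
    mse = sum((p - t) ** 2 for p, t in zip(pred, target)) / len(pred)
    if mse == 0:
        return float("inf")  # identical inputs
    return 10.0 * math.log10(max_val ** 2 / mse)
```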

Real track evaluation

You may need to download the pretrained PWC model to the pwcnet directory (here).

```shell
python test_real.py --n_GPUs 1 --model EBSR --lrcn --non_local --n_feats 128 --n_resblocks 8 --n_resgroups 5 --burst_size 14 --scale 4 --pre_train ./checkpoints/BBSR_realbest_epoch.pth --root burstsr_validation_dataset...
```

Citations

If EBSR helps your research or work, please consider citing it. The following is a BibTeX reference:

```bibtex
@InProceedings{Luo_2021_CVPR,
    author    = {Luo, Ziwei and Yu, Lei and Mo, Xuan and Li, Youwei and Jia, Lanpeng and Fan, Haoqiang and Sun, Jian and Liu, Shuaicheng},
    title     = {EBSR: Feature Enhanced Burst Super-Resolution With Deformable Alignment},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
    month     = {June},
    year      = {2021},
    pages     = {471-478}
}
```

Contact

email: [[email protected], [email protected]]
