
run SegNAFNet_arch #7

Open
Nanncy99 opened this issue Jul 9, 2024 · 5 comments
Nanncy99 commented Jul 9, 2024

What should I do if the forward function lacks masks when calculating model parameters? Hope to get your answer, thank you very much!

HPLQAQ (Owner) commented Jul 9, 2024

Hello, our method needs masks computed in advance with the SAM model.
To run the code, you need the dataset that we provide.

Download the prepared data for the experiments from Baidu Netdisk | OneDrive.
If you only want to run our deblur pipeline, download the test/val datasets (GoPro, RealBlurJ, REDS, ReLoBlur are provided). If you want to train the model yourself, download the train datasets (GoPro is provided). Check the datasets README for the standard dataset structure.
Unzip the data and put it under the datasets dir for the experiments.

The mask-generation scripts are:

scripts/data_preparation/1_create_masks_with_sam.py: create masks for the concat method.
scripts/data_preparation/2_masks_to_grouped_masks.py: create grouped_masks from masks for our MAP method.

I'll try integrating SAM into the inference code later, when I'm available, to avoid the need to download our data.

  • masks (torch.Tensor): A tensor of shape (M, 1, H, W) representing M masks.
    You can obtain the masks following the mask-generation method described in Zheyan Jin, Shiqi Chen, Yueting Chen, Zhihai Xu, and Huajun Feng, “Let Segment Anything Help Image Dehaze,” arXiv preprint arXiv:2306.15870, 2023, or have a look at our code in scripts/data_preparation/1_create_masks_with_sam.py.
    Just to get the code running, you can pass torch.ones(1, 1, H, W) to the function.
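As a minimal sketch of that workaround (the model object and forward signature are illustrative; the exact call depends on SegNAFNet_arch in the repo):

```python
import torch

# Illustrative only: a trivial all-ones mask covering the whole image (M=1),
# as suggested above, just to get the forward pass running.
H, W = 256, 256
img = torch.randn(1, 3, H, W)    # dummy input image batch
masks = torch.ones(1, 1, H, W)   # one mask covering the whole image

# out = model(img, masks)        # 'model' stands in for a SegNAFNet_arch instance
```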

Nanncy99 (Author) commented

Thank you very much for your answer; it has been very useful to me. However, I have one more small question. Where is the MAP Unit from the paper reflected in the code? I don't seem to have found it yet. I hope to receive your reply. Thank you very much again.

HPLQAQ (Owner) commented Jul 10, 2024

# Compute the area of each mask (sum over H, W dimensions)

From line 104 to line 110, we compute the area and the sum_val for each mask to obtain the avg_val.

From line 112 to line 115, these are multiplied with the masks and added together.

If you remove the sum operation at line 115, then lines 104 to 115 implement a MAP Unit.
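The steps above can be sketched as follows. This is a NumPy stand-in, not the repo's actual PyTorch code; the names area, sum_val, and avg_val follow the quoted comment:

```python
import numpy as np

def map_unit(feat, masks):
    """Sketch of one MAP Unit: fill each mask region with its per-channel mean.

    feat:  (C, H, W) feature map
    masks: (M, 1, H, W) binary masks
    Returns (M, C, H, W); summing over M would give the combined output
    described above, so we omit that sum to keep per-mask results.
    """
    area = masks.sum(axis=(2, 3), keepdims=True)                     # (M, 1, 1, 1)
    sum_val = (feat[None] * masks).sum(axis=(2, 3), keepdims=True)   # (M, C, 1, 1)
    avg_val = sum_val / np.clip(area, 1, None)                       # per-mask means
    return avg_val * masks                                           # broadcast back
```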

Nanncy99 (Author) commented

Thank you for your answer. I have a small question regarding scripts/data_preparation/2_masks_to_grouped_masks.py (create grouped_masks from masks for the MAP method).
The code uses an lmdb-format dataset. If my dataset is not in lmdb format, do I need to convert it to lmdb? Hope to get your answer, thank you very much!

HPLQAQ (Owner) commented Jul 11, 2024

You can directly change the read_image_from_lmdb function to a read_image_from_folder function.

Changing the following code to simply walk the data folder and pass each path to read_image_from_folder should work.

env = lmdb.open(lmdb_dir, readonly=True, max_dbs=0)

with env.begin() as txn:
    cursor = txn.cursor()
    for key, _ in cursor:
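A sketch of that change (read_image_from_folder is the repo's function, assumed to take a file path; the extension filter is illustrative):

```python
import os

IMAGE_EXTS = ('.png', '.jpg', '.jpeg')  # extensions to pick up; adjust as needed

def iter_image_paths(data_dir):
    """Walk the image folder instead of iterating an lmdb cursor."""
    for name in sorted(os.listdir(data_dir)):
        if name.lower().endswith(IMAGE_EXTS):
            yield os.path.join(data_dir, name)

# for path in iter_image_paths(data_dir):
#     img = read_image_from_folder(path)   # replaces read_image_from_lmdb
```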
