
Pre-trained weight loading in DA-Clip #55

Open · Trevor-Philips-cbd opened this issue May 29, 2024 · 5 comments

Comments

@Trevor-Philips-cbd

Hello, could you explain how the pre-trained model loaded during DA-CLIP training was obtained? I noticed that the model “laion2b_s34b_b79k” is used in your training command, but your README asks us to download the “daclip_ViT-B-32.pt” file. What is the difference between these two? According to your paper, the pre-trained CLIP weights should be loaded here and kept frozen throughout training. I don’t understand the difference between these two sets of weights, and I hope you can explain it. Thank you.

@Algolzw
Owner

Algolzw commented May 31, 2024

Hi! We actually modified the original CLIP model (in the code) with a controller for image degradation. The controller is retrained and saved as the weights "daclip_ViT-B-32.pt", so you need to load our specific weights when working with degraded images.
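
For illustration, loading that checkpoint with the repo's bundled open_clip fork usually looks roughly like the sketch below. This is a minimal sketch, not the project's exact API: the `daclip_ViT-B-32` model name and the `pretrained/` path are assumptions, and the repo's own scripts (e.g. evaluate.py) are the authoritative reference.

```python
import open_clip  # the modified open_clip shipped with the DA-CLIP repo

# Assumed local path to the downloaded DA-CLIP checkpoint.
checkpoint = 'pretrained/daclip_ViT-B-32.pt'

# The fork is assumed to register a 'daclip_ViT-B-32' config that adds the
# degradation controller on top of the original ViT-B-32 CLIP model.
model, preprocess = open_clip.create_model_from_pretrained(
    'daclip_ViT-B-32', pretrained=checkpoint
)
tokenizer = open_clip.get_tokenizer('ViT-B-32')
```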

@Trevor-Philips-cbd
Author


Hello, if I want to retrain your controller, what should I do? I saw in your documentation that I need to load the “laion2b_s34b_b79k” weights, but I couldn’t find the relevant weights. Could you please provide a download link?

@Algolzw
Owner

Algolzw commented Jun 1, 2024

The training instructions are in this README file. Usually the model weights are downloaded automatically (or you can download them manually from Hugging Face).
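
For context, with stock OpenCLIP the automatic download is triggered by passing the pretrained tag when creating the model, roughly as in the sketch below; the Hugging Face repo named in the comment is where I believe these weights are published, so treat it as an assumption to verify.

```python
import open_clip

# The 'laion2b_s34b_b79k' tag tells open_clip to fetch the pretrained
# ViT-B-32 weights automatically and cache them locally.
model, _, preprocess = open_clip.create_model_and_transforms(
    'ViT-B-32', pretrained='laion2b_s34b_b79k'
)
tokenizer = open_clip.get_tokenizer('ViT-B-32')

# For a manual download, the same weights should be available on the
# Hugging Face Hub (the laion/CLIP-ViT-B-32-laion2B-s34B-b79K repo).
```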

@Trevor-Philips-cbd
Author


Hello, I have loaded the CLIP weights according to the instructions, but I get an error saying that the weights do not match. I followed the instructions in your project and in OpenCLIP to download the “laion2b_s34b_b79k” weights and load them into the model. I am not sure which step I got wrong, and I hope you can give me some guidance.

@Algolzw
Owner

Algolzw commented Jun 4, 2024

Can you show your test code? Since we changed the OpenCLIP code for image degradation, you need to load the weights with our own DA-CLIP code, as in the evaluate.py script.
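
One quick way to see why a plain OpenCLIP checkpoint does not fit the modified model is to inspect the checkpoint keys. A rough diagnostic sketch, assuming the DA-CLIP checkpoint is an ordinary torch checkpoint (a dict) and that the controller parameters contain "control" in their names; both are assumptions, so adapt the names to what evaluate.py actually loads.

```python
import torch

# Hypothetical local path -- adjust to wherever the checkpoint was saved.
ckpt = torch.load('pretrained/daclip_ViT-B-32.pt', map_location='cpu')
state_dict = ckpt.get('state_dict', ckpt)  # some checkpoints nest the weights

# Keys present in the DA-CLIP checkpoint but absent from stock OpenCLIP are
# the controller parameters; loading a plain CLIP checkpoint into the
# modified model (or vice versa) then reports missing/unexpected keys.
extra = [k for k in state_dict if 'control' in k.lower()]
print(f'{len(extra)} controller-related keys, e.g. {extra[:5]}')
```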
