
Inquiry about KAdaptation Implementation for Different ViTs (e.g., DinoV2) #9

Open
MarioAvolio opened this issue Jan 14, 2024 · 1 comment


@MarioAvolio

Hello,

I am currently working on a project that involves applying the KAdaptation technique, as detailed in the paper "Parameter-efficient Model Adaptation for Vision Transformers", to various Vision Transformer models. My focus is particularly on models like DinoV2 and other similar pretrained transformers.

I have been exploring your repository and found it extremely insightful for my research. However, I couldn't locate any specific references or implementations related to the application of KAdaptation on other types of Vision Transformers, like DinoV2.

Could you kindly inform me if there are any existing implementations of KAdaptation that are compatible with DinoV2 or other pretrained transformer models? Additionally, any guidance on adapting KAdaptation to these models would be greatly appreciated.

This information would be immensely beneficial for my ongoing project, and I believe it could also aid others in the community working with similar models and adaptations.

Thank you for your time and assistance.

Best regards,
Mario

@jkooy
Collaborator

jkooy commented Jan 20, 2024

Hi,
I haven't tried DinoV2, but I believe KAdaptation works on other types of ViT as long as they have a similar architecture; Dino mainly changed the training objective, not the model structure. I recently tried KAdaptation for image generation, and the method works there as well. You can refer to the implementation here: https://github.com/eric-ai-lab/PEViT/blob/master/vision_benchmark/evaluation/model.py — it should work similarly for Dino. Thank you!
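For readers following along: the core idea of KAdaptation is to update a frozen weight matrix with a sum of Kronecker products whose second factor is low-rank, so only the small factors are trained. The sketch below illustrates that parameterization in plain NumPy; the shapes, factor count, and names (`A`, `U`, `V`, `adapted_weight`) are illustrative assumptions, not the authors' exact implementation (see the linked PEViT repo for that).

```python
import numpy as np

# Minimal sketch of the Kronecker-adaptation idea:
#   delta_W = sum_i kron(A_i, U_i @ V_i.T)
# where W is frozen and only the small factors A_i, U_i, V_i are trained.
# All shapes below are illustrative assumptions for a ViT-Base-style layer.

rng = np.random.default_rng(0)

d = 768   # hidden size of the frozen linear layer
n = 4     # number of Kronecker factors (and size of each A_i)
r = 8     # rank of the low-rank factor B_i = U_i V_i^T

# Frozen pretrained weight (e.g. an attention projection matrix).
W = rng.standard_normal((d, d))

# Trainable factors: each A_i is (n x n); B_i is (d/n x d/n) but
# parameterized as a rank-r product U_i V_i^T, initialized near zero
# so adaptation starts from the pretrained weights.
A = rng.standard_normal((n, n, n))
U = rng.standard_normal((n, d // n, r)) * 0.01
V = rng.standard_normal((n, d // n, r)) * 0.01

def adapted_weight(W, A, U, V):
    """Return W + sum_i kron(A_i, U_i @ V_i.T)."""
    delta = sum(np.kron(A[i], U[i] @ V[i].T) for i in range(A.shape[0]))
    return W + delta

W_adapted = adapted_weight(W, A, U, V)

# The adapted weight keeps the original shape, while the number of
# trainable parameters is far smaller than full fine-tuning of W.
trainable = A.size + U.size + V.size
print(W_adapted.shape, trainable, W.size)
```

Because only `A`, `U`, and `V` carry gradients, the trainable parameter count here is 12,352 versus 589,824 for the full `768 x 768` weight, which is the parameter-efficiency argument of the paper in miniature.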
