
Finetuning model #268

Open
Srivatsa1648 opened this issue Sep 25, 2024 · 0 comments

Comments

@Srivatsa1648

bash scripts/DINO_train.sh /home/slam/Downloads/REU_Srivatsa/radar_img --pretrain_model_path /home/slam/Downloads/REU_Srivatsa/checkpoint0011_4scale.pth --finetune_ignore label_enc.weight class_embed
When I run the above command and modify the config file to match the number of classes in my dataset, I get the following torch size-mismatch error:
RuntimeError: Error(s) in loading state_dict for DINO:
size mismatch for transformer.decoder.class_embed.0.weight: copying a param with shape torch.Size([91, 256]) from checkpoint, the shape in current model is torch.Size([11, 256]).
size mismatch for transformer.decoder.class_embed.0.bias: copying a param with shape torch.Size([91]) from checkpoint, the shape in current model is torch.Size([11]).
size mismatch for transformer.decoder.class_embed.1.weight: copying a param with shape torch.Size([91, 256]) from checkpoint, the shape in current model is torch.Size([11, 256]).
size mismatch for transformer.decoder.class_embed.1.bias: copying a param with shape torch.Size([91]) from checkpoint, the shape in current model is torch.Size([11]).
size mismatch for transformer.decoder.class_embed.2.weight: copying a param with shape torch.Size([91, 256]) from checkpoint, the shape in current model is torch.Size([11, 256]).
size mismatch for transformer.decoder.class_embed.2.bias: copying a param with shape torch.Size([91]) from checkpoint, the shape in current model is torch.Size([11]).
size mismatch for transformer.decoder.class_embed.3.weight: copying a param with shape torch.Size([91, 256]) from checkpoint, the shape in current model is torch.Size([11, 256]).
size mismatch for transformer.decoder.class_embed.3.bias: copying a param with shape torch.Size([91]) from checkpoint, the shape in current model is torch.Size([11]).
size mismatch for transformer.decoder.class_embed.4.weight: copying a param with shape torch.Size([91, 256]) from checkpoint, the shape in current model is torch.Size([11, 256]).
size mismatch for transformer.decoder.class_embed.4.bias: copying a param with shape torch.Size([91]) from checkpoint, the shape in current model is torch.Size([11]).
size mismatch for transformer.decoder.class_embed.5.weight: copying a param with shape torch.Size([91, 256]) from checkpoint, the shape in current model is torch.Size([11, 256]).
size mismatch for transformer.decoder.class_embed.5.bias: copying a param with shape torch.Size([91]) from checkpoint, the shape in current model is torch.Size([11]).
size mismatch for transformer.enc_out_class_embed.weight: copying a param with shape torch.Size([91, 256]) from checkpoint, the shape in current model is torch.Size([11, 256]).
size mismatch for transformer.enc_out_class_embed.bias: copying a param with shape torch.Size([91]) from checkpoint, the shape in current model is torch.Size([11]).
size mismatch for label_enc.weight: copying a param with shape torch.Size([92, 256]) from checkpoint, the shape in current model is torch.Size([15, 256]).
size mismatch for class_embed.0.weight: copying a param with shape torch.Size([91, 256]) from checkpoint, the shape in current model is torch.Size([11, 256]).
size mismatch for class_embed.0.bias: copying a param with shape torch.Size([91]) from checkpoint, the shape in current model is torch.Size([11]).
size mismatch for class_embed.1.weight: copying a param with shape torch.Size([91, 256]) from checkpoint, the shape in current model is torch.Size([11, 256]).
size mismatch for class_embed.1.bias: copying a param with shape torch.Size([91]) from checkpoint, the shape in current model is torch.Size([11]).
size mismatch for class_embed.2.weight: copying a param with shape torch.Size([91, 256]) from checkpoint, the shape in current model is torch.Size([11, 256]).
size mismatch for class_embed.2.bias: copying a param with shape torch.Size([91]) from checkpoint, the shape in current model is torch.Size([11]).
size mismatch for class_embed.3.weight: copying a param with shape torch.Size([91, 256]) from checkpoint, the shape in current model is torch.Size([11, 256]).
size mismatch for class_embed.3.bias: copying a param with shape torch.Size([91]) from checkpoint, the shape in current model is torch.Size([11]).
size mismatch for class_embed.4.weight: copying a param with shape torch.Size([91, 256]) from checkpoint, the shape in current model is torch.Size([11, 256]).
size mismatch for class_embed.4.bias: copying a param with shape torch.Size([91]) from checkpoint, the shape in current model is torch.Size([11]).
size mismatch for class_embed.5.weight: copying a param with shape torch.Size([91, 256]) from checkpoint, the shape in current model is torch.Size([11, 256]).
size mismatch for class_embed.5.bias: copying a param with shape torch.Size([91]) from checkpoint, the shape in current model is torch.Size([11]).
If I fine-tune without changing the config file, I get a very low AP of 0.26. I am training the dino_4scale model.
Please help me resolve this error as soon as possible.
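
For reference, a common workaround (not necessarily the repo's intended fix via --finetune_ignore) is to filter out the size-mismatched class-head parameters from the checkpoint yourself and load the rest with strict=False. A minimal sketch, assuming the model has already been built with your dataset's num_classes (as in the training script) and that the checkpoint stores its weights under the "model" key, which is typical for DINO checkpoints:

```python
import torch

# Build the model with your own num_classes first, then load only the
# checkpoint tensors whose shapes still match the current model.
checkpoint = torch.load("checkpoint0011_4scale.pth", map_location="cpu")
pretrained = checkpoint.get("model", checkpoint)

model_state = model.state_dict()
filtered = {
    k: v for k, v in pretrained.items()
    if k in model_state and v.shape == model_state[k].shape
}
skipped = [k for k in pretrained if k not in filtered]
print(f"Skipping {len(skipped)} mismatched/unknown params, e.g. {skipped[:3]}")

# strict=False leaves the freshly initialized heads (class_embed, label_enc)
# in place so they can be learned from your dataset during fine-tuning.
missing, unexpected = model.load_state_dict(filtered, strict=False)
```

With the class heads skipped this way, the shape errors above should disappear and only the classification layers start from random initialization, which is expected when the number of classes changes.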
