I just started using RT-DETR.
The checkpoint rtdetrv2_r18vd_120e_coco_rerun_48.1.pth is around 77 MB, but after running this command:
python tools/train.py -c configs/rtdetrv2/rtdetrv2_r18vd_120e_coco.yml --use-amp --seed=0
the checkpoint saved for each epoch is around 300 MB.
Shouldn't the training checkpoints and the pretrained model be the same size?
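A likely explanation (an assumption, not confirmed from the RT-DETR source): released .pth files usually contain only the model weights, while per-epoch training checkpoints also bundle optimizer state (Adam/AdamW keeps two extra buffers per parameter) and often EMA weights, which roughly quadruples the file size. The sketch below simulates this with plain Python dicts and pickle; the key names "model", "optimizer", and "ema" are hypothetical stand-ins for whatever the training script actually saves.

```python
import pickle

def fake_weights():
    # Stand-in for a model state_dict: 10 "tensors" of 1000 floats each.
    return {f"layer{i}.weight": [float(i)] * 1000 for i in range(10)}

# Hypothetical training checkpoint: weights plus optimizer moments and
# EMA weights (key names are assumptions for illustration).
checkpoint = {
    "model": fake_weights(),
    "optimizer": {"exp_avg": fake_weights(),       # Adam's first moment
                  "exp_avg_sq": fake_weights()},   # Adam's second moment
    "ema": fake_weights(),                         # exponential moving average
}

full = len(pickle.dumps(checkpoint))
slim = len(pickle.dumps({"model": checkpoint["model"]}))
print(f"full checkpoint is {full / slim:.1f}x the weights-only file")
```

If this holds for RT-DETR, resaving only the weights portion of a checkpoint (e.g. loading it and dumping just its model state dict) should bring it back down to roughly the size of the released file.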