If I don't freeze the pretrained VGG16's parameters, memory usage keeps growing until it runs out of memory.
But if I freeze these parameters, training works.
Why?
The relevant code in train.py:
for param in net.rpn.features.parameters():
    param.requires_grad = False
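For reference, here is a minimal, self-contained sketch (not this repo's actual training loop; the torchvision VGG16, the dummy head, and the random data are stand-ins) of the two usual reasons memory differs so much when the backbone is left trainable: gradient buffers and optimizer state get allocated for every VGG16 weight, and accumulating the loss tensor instead of a Python float keeps every iteration's graph alive until OOM.

import torch
import torch.nn as nn
import torch.nn.functional as F
import torchvision

# Stand-in for net.rpn.features (set pretrained=True to load ImageNet weights).
backbone = torchvision.models.vgg16(pretrained=False).features
# Hypothetical task head, just to make the sketch runnable.
head = nn.Sequential(nn.Flatten(), nn.Linear(512 * 7 * 7, 2))

# Freezing the backbone, as in train.py, means no gradient buffers or
# optimizer state are ever created for its ~14.7M parameters.
for param in backbone.parameters():
    param.requires_grad = False

# Pass only trainable parameters to the optimizer; otherwise SGD/Adam would
# still allocate state for the frozen weights.
optimizer = torch.optim.SGD(
    [p for p in list(backbone.parameters()) + list(head.parameters()) if p.requires_grad],
    lr=1e-3,
)

running_loss = 0.0
for step in range(10):  # dummy loop with random data
    images = torch.randn(2, 3, 224, 224)
    labels = torch.randint(0, 2, (2,))

    logits = head(F.adaptive_avg_pool2d(backbone(images), 7))
    loss = F.cross_entropy(logits, labels)

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

    # Accumulate a Python float, not the tensor: `running_loss += loss` would
    # retain each iteration's computation graph and grow memory every step.
    running_loss += loss.item()

So freezing the VGG16 features both shrinks the per-step memory footprint and, depending on how the loop logs its losses, can hide a graph-retention bug that only shows up when the backbone's graph is large; checking how the loss is accumulated in train.py would confirm which effect applies here.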
aRookieMan changed the title from "all the input array dimensions except for the concatenation axis must match exact" to "out of memory if don`t fix VGG16 param" on Dec 7, 2018.