I'm trying to reproduce the FCN paper results using AlexNet (imagenet-matconvnet-alex.mat), which, unlike the Caffe reference model (imagenet-caffe-ref.mat), does not contain local response normalization (LRN) layers.
These are the results (shown in the attached PDF) I obtained with:
minibatch size = 1;
learning rate = 1e-5.
After 100 epochs I get meanIU = 25 on the validation set (1111 images), and the validation meanIU appears to have saturated. What should I do next?
1. Train for more epochs?
2. Lower the learning rate and retrain from scratch?
3. Add the local response normalization layers, as in the paper? (A rough sketch of what I have in mind is below.)
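For option 3, this is roughly what I have in mind. It is an untested sketch: the fromSimpleNN conversion mirrors what the FCN example code does, but the layer names 'relu1' and 'conv2' and the variable name 'n1' are my assumptions about the converted model, and I'm going from memory of the DagNN wrapper API, so the exact calls may need adjusting:

```matlab
% Load the SimpleNN AlexNet and convert it to DagNN, as the FCN example code does.
net = load('imagenet-matconvnet-alex.mat') ;
net = dagnn.DagNN.fromSimpleNN(net, 'canonicalNames', true) ;

% Caffe-style LRN parameters for AlexNet: [local_size kappa alpha/local_size beta].
lrnParam = [5 1 0.0001/5 0.75] ;

% Insert an LRN block after relu1: its input is relu1's output variable and its
% output is a new variable 'n1'. ('relu1' and 'n1' are assumed names.)
relu1Out = net.layers(net.getLayerIndex('relu1')).outputs{1} ;
net.addLayer('norm1', dagnn.LRN('param', lrnParam), {relu1Out}, {'n1'}) ;

% Re-wire the following convolution (assumed to be named 'conv2') so that it
% reads the normalized variable instead of relu1's raw output.
net.setLayerInputs('conv2', {'n1'}) ;
net.rebuild() ;
```

For options 1 and 2: if I read cnn_train_dag correctly, opts.learningRate can be a per-epoch vector, so it should be possible to resume from the last checkpoint with a lower rate for the remaining epochs instead of restarting from scratch.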
net-train.pdf