fix dropout = 1.0 issue. If dropout = 1.0, it should not run dropout … #202
base: master
Conversation
Thanks for your contribution. I see why this is better during training. But how should we control the dropout during validation and prediction? There we want to set the dropout (keep probability) to 1.
For prediction, we don't need dropout.
Right. So during training we want dropout to be < 1 and during validation it should be = 1.
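For context, the usual TF1-style way to get this behaviour is to feed keep_prob through a placeholder so that one graph serves both training and validation. The sketch below is illustrative only; the placeholder name, layer sizes, and feed values are assumptions, not code from this PR:

```python
import tensorflow as tf

# keep_prob is a placeholder, so the same graph is reused with different values
keep_prob = tf.placeholder(tf.float32, name="dropout_keep_prob")

x = tf.placeholder(tf.float32, [None, 128])
hidden = tf.layers.dense(x, 64, activation=tf.nn.relu)
hidden = tf.nn.dropout(hidden, keep_prob)  # the dropout op is always part of the graph
logits = tf.layers.dense(hidden, 10)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # training step: keep_prob < 1
    # sess.run(train_op, feed_dict={x: batch, keep_prob: 0.75})
    # validation step: keep_prob = 1 is mathematically a no-op, but the op still executes
    # sess.run(logits, feed_dict={x: batch, keep_prob: 1.0})
```

Note that with this pattern the dropout op still runs during validation; it just has no numerical effect, which is the overhead this PR is trying to avoid.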
We can create two U-Nets with different keep_prob values for training and validation. What do you think about it?
Don't we have to train two models then?
Hi @jakeret, any comments on the data? The numbers are based on CPU runs.
A 16% performance improvement is nice.
I am sorry for the late reply.
I don't see how this should be implemented. The computation graph would be different for the two networks, which makes it hard to transfer the weights from one to the other.
There are no weights in the dropout layer, so it is fine to save the model from the training net and restore it in the validation net.
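A rough sketch of that idea, assuming two graphs that differ only in whether the dropout layer is built, and that give their variables the same names so a checkpoint written by the training net can be restored into the validation net. The helper `build_net`, the keep_prob values, and the checkpoint path are hypothetical, not code from tf_unet:

```python
import tensorflow as tf

def build_net(keep_prob):
    # variables get the same names in both graphs, so checkpoints are interchangeable
    x = tf.placeholder(tf.float32, [None, 128], name="x")
    h = tf.layers.dense(x, 64, activation=tf.nn.relu, name="dense1")
    if keep_prob < 1.0:  # dropout has no variables, so it can simply be skipped
        h = tf.nn.dropout(h, keep_prob)
    logits = tf.layers.dense(h, 10, name="logits")
    return x, logits

# training graph: dropout layer is built
train_graph = tf.Graph()
with train_graph.as_default():
    build_net(keep_prob=0.75)
    train_saver = tf.train.Saver()
    with tf.Session(graph=train_graph) as sess:
        sess.run(tf.global_variables_initializer())
        # ... run training ...
        train_saver.save(sess, "/tmp/model.ckpt")  # illustrative path

# validation graph: dropout layer is never built
eval_graph = tf.Graph()
with eval_graph.as_default():
    build_net(keep_prob=1.0)
    eval_saver = tf.train.Saver()
    with tf.Session(graph=eval_graph) as sess:
        eval_saver.restore(sess, "/tmp/model.ckpt")
        # ... run validation / prediction ...
```

The restore works because the trainable variables (the dense layers here) are identical in both graphs; only the parameter-free dropout op differs.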
The Python dropout op uses the following code to check the keep_prob value:

    if tensor_util.constant_value(keep_prob) == 1:
        return x

If keep_prob is a placeholder, tensor_util.constant_value(keep_prob) returns None, so the if statement is always false and the dropout computation runs even when keep_prob is fed as 1.0.
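A small sketch illustrating that behaviour and the kind of Python-level guard that avoids building the op at all. Note that tensorflow.python.framework.tensor_util is a TensorFlow 1.x internal module, so this is only meant to mirror the check quoted above:

```python
import tensorflow as tf
from tensorflow.python.framework import tensor_util

const_keep = tf.constant(1.0)
placeholder_keep = tf.placeholder(tf.float32)

print(tensor_util.constant_value(const_keep))        # 1.0 -> dropout returns x unchanged
print(tensor_util.constant_value(placeholder_keep))  # None -> the `== 1` check is always False

# Checking a plain Python float before building the layer skips the op entirely:
keep_prob = 1.0
x = tf.placeholder(tf.float32, [None, 128])
h = x if keep_prob >= 1.0 else tf.nn.dropout(x, keep_prob)
```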