The `make_mlp` function also appends batch normalization, dropout, and the activation function after the last layer. This might be problematic, because the output of this function is also used to construct the classifier in the discriminator (`self.real_classifier = make_mlp(...)`). As a result, the final classifier's output is batch-normalized and squashed through a (leaky) ReLU instead of being a raw logit. Is this the desired behaviour?
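The issue can be sketched schematically. This is not the repository's actual `make_mlp` (which builds `torch.nn` modules); it uses plain layer descriptors to show the layer ordering, and the `final_activation` flag is a hypothetical fix that leaves the last linear layer's output raw:

```python
def make_mlp(dims, activation="leakyrelu", batch_norm=True, dropout=0.5,
             final_activation=False):
    """Sketch of an MLP builder. dims = [in, hidden..., out].

    When final_activation is False, batch norm / activation / dropout are
    skipped after the last linear layer, so the classifier emits raw logits.
    When True, it reproduces the reported behaviour: the final output is
    batch-normalized and squashed by the activation.
    """
    layers = []
    n = len(dims) - 1  # number of linear layers
    for i, (d_in, d_out) in enumerate(zip(dims[:-1], dims[1:])):
        layers.append(("linear", d_in, d_out))
        is_last = (i == n - 1)
        if is_last and not final_activation:
            continue  # leave the final output untouched
        if batch_norm:
            layers.append(("batchnorm", d_out))
        layers.append(("activation", activation))
        if dropout > 0:
            layers.append(("dropout", dropout))
    return layers

# With the fix, the classifier ends in a plain linear layer:
print(make_mlp([64, 1024, 1])[-1])        # ('linear', 1024, 1)
# Without it, the final logit is normalized, squashed, and dropped out:
print(make_mlp([64, 1024, 1], final_activation=True)[-3:])
```

A one-logit output passed through batch norm and a LeakyReLU is particularly troubling for a discriminator: batch norm ties each sample's score to the batch statistics, and the activation clips the negative range.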