The categorical_cross_entropy in the Bayesian loss function is wrong. #6
Comments
This might not be the same issue bzhong2 reported, but... Line 59 in bin/train.py is:
But line 78 in bnn/loss_equations.py is:
Is this related to bzhong2's issue? If we rewrite it, can we use only the …?
I am also having the same issue. The function definition is: … But the code on line 68 is: …
Hi @sazya, have you found out whether the bayesian_categorical_crossentropy double-counts the softmax_output? Regarding the …
The function definition is:
```python
tf.keras.backend.categorical_crossentropy(
    target,
    output,
    from_logits=False,
    axis=-1
)
```
But the code on line 68 is:
```python
undistorted_loss = K.categorical_crossentropy(pred, true, from_logits=True)
```
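To illustrate why the argument order matters, here is a minimal NumPy sketch (the helper function is hypothetical, written to mirror the documented semantics of `K.categorical_crossentropy(target, output, from_logits=True)`): the first argument is the label distribution and the second is the logits, so swapping `pred` and `true` as on line 68 generally yields a different loss value.

```python
import numpy as np

def categorical_crossentropy(target, output_logits):
    # Mirrors K.categorical_crossentropy(target, output, from_logits=True):
    # softmax over the logits, then -sum(target * log(probs)) per sample.
    z = output_logits - output_logits.max(axis=-1, keepdims=True)  # numerical stability
    log_probs = z - np.log(np.exp(z).sum(axis=-1, keepdims=True))  # log-softmax
    return -(target * log_probs).sum(axis=-1)

true = np.array([[0.0, 1.0, 0.0]])   # one-hot label
pred = np.array([[2.0, 1.0, 0.5]])   # raw logits from the model

correct = categorical_crossentropy(true, pred)  # target first, logits second
swapped = categorical_crossentropy(pred, true)  # argument order from line 68

print(correct, swapped)  # the two orders give different losses
```

With these values the correct order takes the log-softmax of the logits and picks out the true class, while the swapped order applies softmax to the one-hot label vector instead, so the result is not a valid cross-entropy of the prediction against the label.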