
The categorical_cross_entropy in the Bayesian loss function is wrong. #6

Open

bzhong2 opened this issue Oct 25, 2018 · 3 comments

bzhong2 commented Oct 25, 2018

The function definition is:
tf.keras.backend.categorical_crossentropy(
target,
output,
from_logits=False,
axis=-1
)

But the code on line 68 is:
undistorted_loss = K.categorical_crossentropy(pred, true, from_logits=True)
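
For reference, a minimal sketch of the argument order the signature implies (assuming, as the variable names suggest, that true holds one-hot labels and pred holds raw logits):

import tensorflow as tf
from tensorflow.keras import backend as K

true = tf.constant([[0., 1., 0.]])     # one-hot target
pred = tf.constant([[2.0, 1.0, 0.1]])  # raw logits

# The signature is categorical_crossentropy(target, output, ...),
# so the labels go first and the logits second:
loss = K.categorical_crossentropy(true, pred, from_logits=True)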

sazya commented Nov 2, 2018

It might not be the same issue as bzhong2's, but...

On line 59 of bin/train.py, the loss seems to include both logits_variance and softmax_output.

model.compile(
	optimizer=Adam(lr=1e-3, decay=0.001),
	loss={
	'logits_variance': bayesian_categorical_crossentropy(FLAGS.monte_carlo_simulations, num_classes),
	'softmax_output': 'categorical_crossentropy'
	},
	metrics={'softmax_output': metrics.categorical_accuracy},
	loss_weights={'logits_variance': .2, 'softmax_output': 1.})

But line 78 of bnn/loss_equations.py is

return variance_loss + undistorted_loss + variance_depressor

bayesian_categorical_crossentropy includes undistorted_loss, which is the same as the softmax_output loss.
Is this a double count?

And is this related to bzhong2's issue? If we rewrite undistorted_loss as

undistorted_loss = K.categorical_crossentropy(pred, true, from_logits=True)
 -> undistorted_loss = K.categorical_crossentropy(true, pred, from_logits=True)

can we use only the logits_variance in the loss?
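
One possible workaround (a sketch only, not tested against this repo; model, Adam, metrics, FLAGS, and num_classes are assumed from bin/train.py) is to keep softmax_output as an output for the accuracy metric while zeroing its loss weight, so the cross-entropy term is counted only once, inside the Bayesian loss:

model.compile(
	optimizer=Adam(lr=1e-3, decay=0.001),
	loss={
	'logits_variance': bayesian_categorical_crossentropy(FLAGS.monte_carlo_simulations, num_classes),
	'softmax_output': 'categorical_crossentropy'
	},
	metrics={'softmax_output': metrics.categorical_accuracy},
	# weight 0. keeps the metric but drops the duplicate loss term
	loss_weights={'logits_variance': 1., 'softmax_output': 0.})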

SivagopinathreddyVinta commented

I am also having the same issue.


GKalliatakis commented Jan 20, 2020

Hi @sazya, have you found out whether bayesian_categorical_crossentropy double-counts the softmax_output loss?

Regarding from_logits being set to True, the Keras docs say: from_logits: Boolean, whether 'output' is the result of a softmax, or is a tensor of logits.
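
To illustrate what that flag changes, a standalone sketch (assuming TF 2.x eager execution): with from_logits=True the backend applies softmax internally, so the two calls below should yield the same value:

import tensorflow as tf
from tensorflow.keras import backend as K

labels = tf.constant([[0., 1., 0.]])
logits = tf.constant([[2.0, 1.0, 0.1]])
probs = tf.nn.softmax(logits)  # what a softmax layer would output

loss_from_logits = K.categorical_crossentropy(labels, logits, from_logits=True)
loss_from_probs = K.categorical_crossentropy(labels, probs)  # from_logits=False by default

print(float(loss_from_logits), float(loss_from_probs))  # the two values match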
