Call LossLayer::LayerSetUp in SmoothL1LossLayer. #8

Open · wants to merge 1 commit into faster-rcnn

Conversation

erictzeng (Contributor)

If SmoothL1LossLayer doesn't have a loss_weight explicitly set, then it currently defaults to 0:

I0426 00:15:33.254120  1242 solver.cpp:228] Iteration 11360, loss = 0
I0426 00:15:33.254154  1242 solver.cpp:244]     Train net output #0: loss = 0.0142144

In particular, note the first line, in which the loss is 0 even though the output is nonzero.

This PR is a one-line change that calls the parent LossLayer::LayerSetUp method, which does the work of setting the default loss_weight to 1.
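
For reference, the change amounts to adding the base-class call at the top of the layer's `LayerSetUp`. A minimal sketch (the rest of the method body is abridged, and details may differ slightly from this fork's source):

```cpp
// src/caffe/layers/smooth_L1_loss_layer.cpp -- sketch of the patched method.
template <typename Dtype>
void SmoothL1LossLayer<Dtype>::LayerSetUp(
    const vector<Blob<Dtype>*>& bottom, const vector<Blob<Dtype>*>& top) {
  // The added line: delegate to the base class, which fills in the
  // default loss_weight of 1 when none is specified in the prototxt.
  LossLayer<Dtype>::LayerSetUp(bottom, top);
  // ... the layer's existing setup code continues unchanged ...
}
```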

(On a related note: in the py-faster-rcnn repository, loss_weight appears to always be set explicitly, so this bug never surfaced there.)

Commit message:

This gives the layer a loss_weight of 1 if no loss_weight is explicitly
specified (previously it was defaulting to 0).
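
For context, the default itself comes from the base class. Upstream Caffe's `LossLayer<Dtype>::LayerSetUp` is roughly the following (quoted from memory of upstream Caffe; check src/caffe/layers/loss_layer.cpp for the exact text):

```cpp
// src/caffe/layers/loss_layer.cpp (upstream Caffe, shown for context)
template <typename Dtype>
void LossLayer<Dtype>::LayerSetUp(
    const vector<Blob<Dtype>*>& bottom, const vector<Blob<Dtype>*>& top) {
  // Loss layers get a loss_weight of 1 by default, so their output
  // actually contributes to the total objective reported by the solver.
  if (this->layer_param_.loss_weight_size() == 0) {
    this->layer_param_.add_loss_weight(Dtype(1));
  }
}
```

With that call in place, omitting loss_weight in the prototxt yields the expected weight of 1, rather than the layer's output being silently dropped from the total loss.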