
About K value when calculating loss #2

Open
VikingKang opened this issue Nov 6, 2020 · 1 comment

@VikingKang

Thanks for sharing this work.
I noticed that when you calculate the loss, K appears to be computed over the whole batch rather than per image. Was that a deliberate choice, for example because there is little practical difference between batch-level and per-image selection?
Would it be better to compute K separately for each image?

From your paper:

L = L_detection + α·L_classification (1)

Detection loss L_detection is itself a weighted sum of three components: mean binary cross-entropy loss on positive pixels L_p, mean binary cross-entropy loss on negative pixels L_n, and mean binary cross-entropy loss on the worst-predicted k negative pixels L_h, where k is equal to the number of positive pixels in the image.

@asmekal
Owner

asmekal commented Nov 7, 2020

If we choose K over the whole batch, the average value of this loss term will be higher than if we choose K for each image separately, so we get a stronger penalty on hard negatives. Imagine that most images in a batch contain only 'easy' negatives, which would not affect the loss significantly; in that case it is better to pay more attention to the other images in the batch. That was our motivation, so we haven't tried per-image negative selection.

Thanks for sharing the citation from the paper. Yes, we used batch-level K selection, not a separate selection for each image; the description in the paper is incorrect here :(
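
To make the difference concrete, here is a minimal sketch of the two strategies (my own illustration in PyTorch, not the repository's code; the function name `detection_loss`, the `(B, H, W)` tensor shapes, and the unit weights on the three terms are all assumptions):

```python
# Minimal sketch: batch-level vs per-image hard-negative selection.
# Assumes at least one positive pixel per image; not the repo's actual code.
import torch
import torch.nn.functional as F

def detection_loss(pred, target, batch_level_k=True):
    """BCE detection loss with hard-negative mining (illustrative only)."""
    bce = F.binary_cross_entropy(pred, target, reduction="none")  # (B, H, W)
    pos = target > 0.5
    neg = ~pos

    loss_pos = bce[pos].mean()
    loss_neg = bce[neg].mean()

    if batch_level_k:
        # Batch-level selection: k = total positives in the batch; the
        # hardest negatives are picked across all images at once, so
        # images rich in hard negatives dominate this term.
        k = int(pos.sum())
        loss_hard = bce[neg].topk(min(k, int(neg.sum()))).values.mean()
    else:
        # Per-image selection: each image contributes its own k hardest
        # negatives, where k is that image's positive-pixel count.
        per_image = []
        for b in range(pred.size(0)):
            k_b = int(pos[b].sum())
            neg_b = bce[b][neg[b]]
            if k_b > 0 and neg_b.numel() > 0:
                per_image.append(neg_b.topk(min(k_b, neg_b.numel())).values.mean())
        loss_hard = torch.stack(per_image).mean()

    # Unit weights are placeholders; the paper describes a weighted sum.
    return loss_pos + loss_neg + loss_hard
```

With `batch_level_k=True`, an image full of easy negatives contributes almost nothing to the hard-negative term, which is exactly the behavior described above.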
