
how to use #2

Open · xiaoshuyun opened this issue Feb 23, 2021 · 3 comments

@xiaoshuyun

Thank you for your excellent work.
I'd like to know how I can use your seesaw loss to replace cross-entropy in a traditional detection algorithm.
Looking forward to your reply.

@bamps53 (Owner)

bamps53 commented Feb 23, 2021

I believe you can use it by just replacing BCEWithLogitsLoss in your code with this variant.

class DistibutionAgnosticSeesawLossWithLogits(nn.Module):

But please note that this implementation is now outdated: the original authors have updated the loss since the first revision presented at the ECCV 2020 workshop. (It's mentioned in this thread.)
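For context, here is a minimal, self-contained sketch of the seesaw mitigation mechanism described in the Seesaw Loss paper (Wang et al., CVPR 2021). This is NOT the repository's actual implementation; the class name, constructor signature, and running-count bookkeeping are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SeesawCrossEntropySketch(nn.Module):
    """Illustrative sketch of the seesaw mitigation factor, not the repo's code.

    The mitigation factor S[i, j] = (N_j / N_i)^p when N_j < N_i, else 1,
    down-weights gradients from frequent positive classes onto rare
    negative classes by rescaling the negative-class logits.
    """

    def __init__(self, num_classes: int, p: float = 0.8):
        super().__init__()
        self.p = p
        # Running per-class sample counts, updated each forward pass.
        self.register_buffer("class_counts", torch.ones(num_classes))

    def forward(self, logits: torch.Tensor, targets: torch.Tensor) -> torch.Tensor:
        # logits: (batch, num_classes); targets: integer indices, (batch,)
        self.class_counts += torch.bincount(
            targets, minlength=logits.size(1)
        ).float()
        n = self.class_counts
        # ratio[i, j] = N_j / N_i
        ratio = n[None, :] / n[:, None]
        s = torch.where(ratio < 1, ratio.pow(self.p), torch.ones_like(ratio))
        # For each sample with positive class i, scale negative logits by
        # log S[i, j]; S[i, i] = 1, so the positive logit is unchanged.
        onehot = F.one_hot(targets, logits.size(1)).float()
        scale = s[targets]  # (batch, num_classes)
        adjusted = logits + torch.log(scale.clamp(min=1e-12)) * (1 - onehot)
        return F.cross_entropy(adjusted, targets)
```

With equal class counts the mitigation factor is all ones and this reduces to plain cross-entropy, which is a quick sanity check when swapping it in.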

@xiaoshuyun (Author)

Can BCEWithLogitsLoss be used for multi-class classification or object detection?

@kuotunyu

> Can BCEWithLogitsLoss be used for multi-class classification or object detection?

Did you solve it?
I have absolutely no idea how to use it.

In multi-class classification, I changed nn.CrossEntropyLoss() to DistibutionAgnosticSeesawLossWithLogits or SeesawLossWithLogits, but it keeps raising IndexError: too many indices for tensor of dimension 1 ...
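One plausible cause of that error (an assumption, not a confirmed diagnosis of this repository's code) is a target-shape mismatch: nn.CrossEntropyLoss takes 1-D integer class indices, while BCE-style losses, which the seesaw variants here are modeled on, expect float targets of shape (batch, num_classes). A minimal sketch of the difference:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

num_classes = 5
logits = torch.randn(8, num_classes)
targets = torch.randint(0, num_classes, (8,))  # integer indices, shape (8,)

# nn.CrossEntropyLoss takes the 1-D integer index tensor directly.
ce = nn.CrossEntropyLoss()(logits, targets)

# BCE-style losses expect one-hot float targets of shape (batch, num_classes).
# Feeding the 1-D index tensor to a loss that indexes targets with two
# subscripts is one way to get
# "IndexError: too many indices for tensor of dimension 1".
onehot = F.one_hot(targets, num_classes).float()
bce = nn.BCEWithLogitsLoss()(logits, onehot)
```

If the seesaw variants here follow the BCEWithLogitsLoss convention, converting the integer labels with F.one_hot before calling the loss may be worth trying.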
