
Question about the handling of the background class in the UP loss #8

Open
Feobi1999 opened this issue May 11, 2022 · 3 comments

Comments

@Feobi1999

        # Foreground samples get a soft target of g * (1 - g)^alpha at class
        # index self.num_classes - 2.
        targets[:num_fg, self.num_classes-2] = gt_scores[:num_fg] * \
            (1-gt_scores[:num_fg]).pow(self.alpha)
        # Background samples get the same form of soft target, but at class
        # index self.num_classes - 1.
        targets[num_fg:, self.num_classes-1] = gt_scores[num_fg:] * \
            (1-gt_scores[num_fg:]).pow(self.alpha)

        return self._soft_cross_entropy(mask_scores, targets.detach())

I noticed that here you seem to treat the background class the same way as the unknown class. Could you explain the reason for this? I didn't find any discussion of it in the paper.
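For context, the `_soft_cross_entropy` call at the end of the snippet presumably computes a cross-entropy against soft (non-one-hot) target rows; a minimal sketch of such a helper, written in plain PyTorch as an illustration rather than the repository's exact implementation:

    import torch
    import torch.nn.functional as F

    def soft_cross_entropy(logits, soft_targets):
        # Cross-entropy against soft target rows: -sum_c t_c * log p_c per sample,
        # averaged over the batch. With one-hot rows this reduces to the usual
        # cross-entropy loss; here the rows are the soft targets built above.
        log_probs = F.log_softmax(logits, dim=1)
        return -(soft_targets * log_probs).sum(dim=1).mean()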

@csuhan
Owner

csuhan commented May 12, 2022

[image attachment]

We select the same number of background samples (as foreground samples) to 1) balance foreground and background samples, and 2) recall unknowns from the background.
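Put together, the idea can be sketched roughly as follows (variable names follow the snippet quoted in the issue; `build_up_targets` and the way the background subset is chosen are illustrative assumptions, not the repository's actual code):

    import torch

    def build_up_targets(fg_scores, bg_scores, num_classes, alpha):
        # 1) balance: keep only as many background samples as foreground ones
        #    (how that subset is picked, e.g. by score, is a detail not shown here)
        num_fg = fg_scores.size(0)
        bg_scores = bg_scores[:num_fg]

        gt_scores = torch.cat([fg_scores, bg_scores], dim=0)
        targets = torch.zeros(gt_scores.size(0), num_classes)

        # 2) both groups receive a soft target of the same form, g * (1 - g)^alpha,
        #    written into the two different class slots used in the snippet above
        targets[:num_fg, num_classes - 2] = gt_scores[:num_fg] * (1 - gt_scores[:num_fg]).pow(alpha)
        targets[num_fg:, num_classes - 1] = gt_scores[num_fg:] * (1 - gt_scores[num_fg:]).pow(alpha)
        return targets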

@libig1012

Hello, I'd like to ask whether self.num_classes here means known_classes + unknown_class + bg.
I see that the classifier used for training is defined as self.cls_score = nn.Linear(
self.cls_score.in_features, self.num_classes + 1, bias=False), so I don't quite understand how the probability of the unknown class is obtained.
Thanks.

@fuyimin96


Same question here.
