
About training loss, Where is the Gradient Penalty? #478

Open
QinjieXiao opened this issue Dec 14, 2024 · 0 comments
Comments

QinjieXiao commented Dec 14, 2024

Thanks for releasing the excellent work. I have some questions about the training loss.
Where is the Gradient Penalty? I found some log entries referring to a "Gradient Penalty", e.g., D_GP in the released ckpt, but I cannot find the corresponding code in projected_model.py.
Also, I found a GP implementation in fs_model.py, but it was not used for training.
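For context on what a "D_GP" log entry usually means: in StyleGAN-family training code it typically denotes the R1 penalty of Mescheder et al. (2018), gamma/2 · E[||∇_x D(x)||²] evaluated on real images. Whether this project actually applies it is exactly the open question above; the following is only an illustrative sketch of the formula (plain Python, gradients supplied analytically for a toy linear discriminator), not code from this repository:

```python
def r1_penalty(per_sample_grads, gamma=10.0):
    """R1 penalty: gamma/2 * mean over samples of the squared gradient norm.

    per_sample_grads: list of flattened gradients of D w.r.t. each real input.
    In a real training loop these would come from autograd (e.g.
    torch.autograd.grad with create_graph=True); here they are passed in
    directly so the formula itself is easy to check.
    """
    sq_norms = [sum(g * g for g in grad) for grad in per_sample_grads]
    return 0.5 * gamma * sum(sq_norms) / len(sq_norms)

# Toy check: for a linear discriminator D(x) = w . x, grad_x D(x) = w for
# every sample, so the penalty reduces to gamma/2 * ||w||^2.
w = [0.6, 0.8]                              # ||w||^2 = 1.0
penalty = r1_penalty([w, w], gamma=10.0)    # 0.5 * 10 * 1.0 = 5.0
```

Note that R1 is applied lazily in StyleGAN2-style code (only every N discriminator steps, with the coefficient rescaled), which is one reason it can appear in logs while being easy to miss in the loss module.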

Title history (all changes by QinjieXiao on Dec 14, 2024):
"About training loss Why loss_Gmain use "(-gen_logits).mean()" instead of Hinge loss e.g., "(-F.relu(gen_logits)).mean()" ?"
→ "About training loss Why loss_Gmain use "(-gen_logits).mean()" instead of Hinge loss e.g., "(-F.relu(gen_logits)).mean()" ? Where is the Gradient Penalty?"
→ "About training loss, why loss_Gmain use "(-gen_logits).mean()" instead of Hinge loss e.g., "(-F.relu(gen_logits)).mean()" ? Where is the Gradient Penalty?"
→ "About training loss, Where is the Gradient Penalty?"
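On the loss question raised in the title history: the candidates differ only in how generator logits are mapped to a scalar loss. A framework-free sketch of the three forms, with plain Python standing in for the tensor ops (this mirrors the expressions quoted in the issue, not the repository's actual code, which I cannot confirm here):

```python
import math

def softplus(x):
    # Numerically stable log(1 + exp(x)).
    return math.log1p(math.exp(-abs(x))) + max(x, 0.0)

def g_loss_linear(gen_logits):
    # (-gen_logits).mean() -- the linear (WGAN-style) form the issue found.
    return sum(-l for l in gen_logits) / len(gen_logits)

def g_loss_hinge_variant(gen_logits):
    # (-F.relu(gen_logits)).mean() -- the variant the issue proposes;
    # positive logits are rewarded, negative ones contribute nothing.
    return sum(-max(l, 0.0) for l in gen_logits) / len(gen_logits)

def g_loss_nonsat(gen_logits):
    # softplus(-gen_logits).mean() -- StyleGAN2's non-saturating loss,
    # shown for comparison.
    return sum(softplus(-l) for l in gen_logits) / len(gen_logits)

logits = [2.0, -1.0]
# linear: -(2 - 1)/2 = -0.5; relu variant: -(2 + 0)/2 = -1.0
```

The linear and relu-based forms have the same gradient wherever the logits are positive; they diverge only on confidently rejected fakes, where the relu variant gives the generator no gradient at all, which is one practical argument for the linear or softplus forms.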
Labels: none · Projects: none · Development: no branches or pull requests · 1 participant