When you do `loss = classify_loss` and then add the regression loss to `loss` in place, you also modify `classify_loss` at the same time. This is because the assignment makes `loss` and `classify_loss` refer to the same tensor (the same memory address). You should either copy the tensor first (e.g. with `.clone()`), or simply return `classify_loss + torch.sum(torch.stack(regress_losses))` instead of creating a new `loss` tensor and returning it.
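The aliasing and the suggested fix can be demonstrated with a minimal sketch (the scalar loss values here are made up purely for illustration):

```python
import torch

classify_loss = torch.tensor(1.0)
regress_losses = [torch.tensor(0.5), torch.tensor(0.25)]

# Buggy pattern: `loss` is just another name for the same tensor object,
# so the in-place add mutates classify_loss as well.
loss = classify_loss
loss += torch.sum(torch.stack(regress_losses))
print(classify_loss.item())  # 1.75 -- classify_loss was silently modified

# Fix: out-of-place addition returns a fresh tensor,
# leaving classify_loss untouched.
classify_loss = torch.tensor(1.0)
total_loss = classify_loss + torch.sum(torch.stack(regress_losses))
print(classify_loss.item())  # 1.0 -- unchanged
```

Using `classify_loss.clone()` before the in-place add would work too, but returning the out-of-place sum is simpler and keeps the autograd graph intact.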