
[Bugfix] Fix bias for 0-dim tensors in gemm #1246

Merged 5 commits into NVIDIA:main on Oct 17, 2024

Conversation

@yaox12 (Collaborator) commented on Oct 12, 2024

Description

For te_gemm and te_grouped_gemm,

  • If grad == false, we should do nothing to bias.
  • If grad == true, we should calculate grad_bias here (see the sketch below).
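
A minimal sketch of the intended handling in the empty-input early-return path (illustrative only, not the exact patch; bias, B, and grad follow the names used in the review snippets below):

  if (grad) {
    // Backward pass: the bias tensor receives grad_bias, reduced from B.
    if (bias.numel() != 0) bias.copy_(B.sum(0));
  } else {
    // Forward pass: bias is an input here, so leave it untouched.
  }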

Type of change

  • Documentation change (change only to the documentation, either a fix or new content)
  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • Infra/Build change
  • Code refactor

Checklist:

  • I have read and followed the contributing guidelines
  • The functionality is complete
  • I have commented my code, particularly in hard-to-understand areas
  • I have made corresponding changes to the documentation
  • My changes generate no new warnings
  • I have added tests that prove my fix is effective or that my feature works
  • New and existing unit tests pass locally with my changes

Comment on lines 20 to 21
// torch.sum is able to handle 0-dim tensors
if (bias.data_ptr() != nullptr && grad) bias.copy_(B.sum(0));
Collaborator:

We should handle the case where B has no data:

Suggested change (replacing the two lines above):

  if (bias.data_ptr() != nullptr && grad) {
    if (B.data_ptr() == nullptr) {
      bias.zero_();
    } else {
      bias.copy_(B.sum(0));
    }
  }

It seems we are checking data_ptr() == nullptr in order to detect empty tensors. As an alternative, it may be better to make our intention explicit:

  if (A.numel() == 0 || B.numel() == 0) {
    if (D.numel() != 0 && !accumulate) D.zero_();
    // torch.sum is able to handle 0-dim tensors
    if (bias.numel() != 0 && grad) bias.copy_(B.sum(0));
    if (pre_gelu_out.numel() != 0) pre_gelu_out.zero_();
    return;
  }
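
For reference, a standalone check (illustrative, using the LibTorch C++ API; not part of the patch) that summing over a zero-sized dimension yields a correctly shaped zero tensor:

  #include <torch/torch.h>
  #include <iostream>

  int main() {
    // B with a zero-sized leading dimension, e.g. an empty input batch.
    auto B = torch::empty({0, 8});

    // Reducing over the empty dim returns an [8] tensor of zeros,
    // which is exactly the grad_bias wanted for empty inputs.
    auto grad_bias = B.sum(0);
    std::cout << grad_bias.sizes() << std::endl;  // prints [8]
  }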

@yaox12 (Collaborator, Author) replied:

Updated.

@timmoon10 self-requested a review on Oct 14, 2024
@yaox12 (Collaborator, Author) commented on Oct 15, 2024:

/te-ci pytorch

@ksivaman (Member) left a review:

LGTM, thanks!

@yaox12 (Collaborator, Author) commented on Oct 16, 2024:

/te-ci pytorch

@yaox12 (Collaborator, Author) commented on Oct 17, 2024:

/te-ci pytorch

@yaox12 (Collaborator, Author) commented on Oct 17, 2024:

I'll merge this PR since CI passes except for a doc test that fails with an irrelevant error.

@yaox12 merged commit 8e97c8d into NVIDIA:main on Oct 17, 2024; 26 of 27 checks passed.
@yaox12 deleted the xiny/fix_bias branch on Oct 17, 2024.
ksivaman pushed a commit that referenced this pull request Oct 30, 2024
* fix bias for 0-dim tensor

Signed-off-by: Xin Yao <[email protected]>

* add check

Signed-off-by: Xin Yao <[email protected]>

* use numel() instead of nullptr

Signed-off-by: Xin Yao <[email protected]>

---------

Signed-off-by: Xin Yao <[email protected]>
timmoon10 pushed a commit to timmoon10/TransformerEngine that referenced this pull request Nov 7, 2024 (same commit message as above).