
[PyTorch] Add support for padding mask in UnfusedDotProductAttention #1073

Merged: 32 commits merged into NVIDIA:main on Aug 21, 2024

Conversation

@cyanguwa (Collaborator) commented Aug 2, 2024

Description

This PR:

  • adds support for the padding, padding_causal, and padding_causal_bottom_right mask types in the PyTorch UnfusedDotProductAttention backend (a minimal sketch of the masking logic follows this list)
  • adds support for the padding, padding_causal, and padding_causal_bottom_right mask types combined with ALiBi in UnfusedDotProductAttention
  • adds documentation for attention mask support
  • improves the logic for selecting the fused and unfused softmax backends
  • converts causal/padding_causal to causal_bottom_right/padding_causal_bottom_right when InferenceParams is present in DotProductAttention
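
For orientation, here is a minimal sketch of how a padding_causal / padding_causal_bottom_right mask can be materialized for an unfused attention path. This is an illustration under stated assumptions, not TE's actual implementation: `make_padding_causal_mask`, its arguments, and the "True = masked out" convention are hypothetical here.

```python
import torch

def make_padding_causal_mask(seqlens_q, seqlens_kv, max_q, max_kv, bottom_right=False):
    """Hypothetical helper: builds a boolean mask where True marks positions
    to exclude from attention (assumed convention for this sketch)."""
    q_idx = torch.arange(max_q).view(1, -1, 1)    # [1, max_q, 1]
    kv_idx = torch.arange(max_kv).view(1, 1, -1)  # [1, 1, max_kv]
    # Padding: mask out keys beyond each sequence's actual length.
    pad = kv_idx >= seqlens_kv.view(-1, 1, 1)
    # Causal: top-left alignment puts the diagonal at (0, 0);
    # bottom-right alignment shifts it by the per-sequence length
    # difference, which is what KV-cache inference needs.
    offset = (seqlens_kv - seqlens_q).view(-1, 1, 1) if bottom_right else 0
    causal = kv_idx > q_idx + offset
    return (pad | causal).unsqueeze(1)  # [batch, 1, max_q, max_kv]
```

Such a mask would be applied to the attention logits with `masked_fill(mask, -inf)` before the softmax; fully masked rows (padded queries) then produce NaNs and are typically zeroed out afterwards.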

Type of change

  • Documentation change (change only to the documentation, either a fix or new content)
  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • Infra/Build change
  • Code refactor

Changes

Please list the changes introduced in this PR:

  • Adds support for padding masks in UnfusedDotProductAttention and improves padding-related logic in attention (a usage sketch follows)
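
As a usage sketch, the call below shows how these mask types are passed to DotProductAttention. Tensor shapes, the default qkv format ("sbhd"), and the exact attention_mask layout expected for padding masks may differ across TE versions, so treat this as an assumption-laden illustration rather than a reference:

```python
import torch
from transformer_engine.pytorch import DotProductAttention

b, s, h, d = 2, 128, 16, 64
q = torch.randn(s, b, h, d, dtype=torch.float16, device="cuda")
k, v = torch.randn_like(q), torch.randn_like(q)

# Boolean padding mask; assumed convention: True = masked out.
# Assumed layout [batch, 1, 1, seqlen] for self-attention padding masks.
seqlens = torch.tensor([100, 60], device="cuda")
mask = (torch.arange(s, device="cuda")[None, :] >= seqlens[:, None]).view(b, 1, 1, s)

attn = DotProductAttention(num_attention_heads=h, kv_channels=d)
out = attn(q, k, v, attention_mask=mask, attn_mask_type="padding_causal")
```

To exercise the unfused backend specifically, the NVTE_FLASH_ATTN=0 and NVTE_FUSED_ATTN=0 environment variables can be used to disable the other attention backends.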

Checklist:

  • I have read and followed the contributing guidelines
  • The functionality is complete
  • I have commented my code, particularly in hard-to-understand areas
  • I have made corresponding changes to the documentation
  • My changes generate no new warnings
  • I have added tests that prove my fix is effective or that my feature works
  • New and existing unit tests pass locally with my changes

@cyanguwa (Collaborator, Author) commented Aug 7, 2024: /te-ci pytorch

@cyanguwa (Collaborator, Author) commented Aug 7, 2024: /te-ci pytorch

@cyanguwa (Collaborator, Author) commented: /te-ci pytorch

@cyanguwa (Collaborator, Author) commented: /te-ci pytorch

@cyanguwa requested a review from ptrendx, Aug 15, 2024

@cyanguwa (Collaborator, Author) commented: /te-ci pytorch

@ptrendx added the 1.10.0 label, Aug 16, 2024

@cyanguwa (Collaborator, Author) commented: /te-ci pytorch

@cyanguwa (Collaborator, Author) commented: /te-ci pytorch

@cyanguwa cyanguwa merged commit 467b39a into NVIDIA:main Aug 21, 2024
26 checks passed
BeingGod pushed a commit to BeingGod/TransformerEngine that referenced this pull request, Aug 30, 2024:

[PyTorch] Add support for padding mask in UnfusedDotProductAttention (NVIDIA#1073)

* add support for padding in UnfusedDPA
* add support for padding_causal/_bottom_right
* fix padding_causal/_bottom_right
* need to test max512 backend
* revert last commit
* fix mask logic in unfused
* use actual_seqlen for alibi/causal_bottom_right padding
* fix lint
* minor fixes and convert causal to causal_bottom_right for inference
* use causal in kv cache inference test
* simplify get_alibi logic
* simplify the non-padding path for get_alibi
* avoid batch_size loop in generating padding_causal/_bottom_right masks
* [pre-commit.ci] auto fixes from pre-commit.com hooks (interleaved between the commits above; see https://pre-commit.ci)

Signed-off-by: Charlene Yang <[email protected]>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Signed-off-by: beinggod <[email protected]>
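
The commits "use actual_seqlen for alibi/causal_bottom_right padding" and "simplify get_alibi logic" concern how the ALiBi bias is aligned when sequences are padded. Below is a minimal sketch of that idea, assuming the standard ALiBi slope definition; `get_alibi_bias` and its signature are hypothetical, not TE's internal get_alibi:

```python
import torch

def get_alibi_bias(n_heads, max_q, max_kv, seqlens_q=None, seqlens_kv=None):
    # Standard ALiBi slopes: 2^(-8*i/n_heads) for heads i = 1..n_heads.
    slopes = torch.pow(2.0, -8.0 * torch.arange(1, n_heads + 1) / n_heads)
    q_idx = torch.arange(max_q).view(1, 1, -1, 1)
    kv_idx = torch.arange(max_kv).view(1, 1, 1, -1)
    if seqlens_q is None:
        # Non-padding path: zero diagonal at the bottom-right of the full matrix.
        offset = max_kv - max_q
    else:
        # Padding path: per-sequence offset, so the zero diagonal sits at the
        # bottom-right of each sequence's *actual* (unpadded) region.
        offset = (seqlens_kv - seqlens_q).view(-1, 1, 1, 1)
    # Bias of shape [batch or 1, n_heads, max_q, max_kv], added to the
    # attention logits before softmax; entries grow more negative with distance.
    return slopes.view(1, -1, 1, 1) * (kv_idx - q_idx - offset)
```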
ptrendx pushed a commit that referenced this pull request, Aug 31, 2024, with the same commit message as above (#1073).