
NATTEN example #16

Merged: 1 commit into pytorch-labs:main on Aug 14, 2024
Conversation

@Birch-san (Contributor) commented on Aug 14, 2024

See my parity test, which demonstrates that the FlexAttention implementation is allclose-equivalent both to NATTEN and to masked SDPA.

[Figure: 3x3 NATTEN kernel mask on a 6x6 canvas]
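For readers who want the gist of the approach, here is a minimal sketch (not the PR's actual test code) of a NATTEN-style neighbourhood mask written as a FlexAttention `mask_mod`, together with an allclose-style parity check against masked SDPA. The helper name `natten_mask_mod`, the 16x16 canvas (chosen so the 256-token sequence is a multiple of FlexAttention's default 128 block size; the figure above uses 6x6), and the tolerances are all illustrative assumptions.

```python
import torch
import torch.nn.functional as F
from torch.nn.attention.flex_attention import create_block_mask, flex_attention

# Illustrative sizes for this sketch (the figure above uses a 6x6 canvas).
CANVAS_H, CANVAS_W = 16, 16
KERNEL_H, KERNEL_W = 3, 3
SEQ = CANVAS_H * CANVAS_W  # canvas flattened to a 256-token sequence

def natten_mask_mod(b, h, q_idx, kv_idx):
    # Recover 2D canvas coordinates from flattened token indices.
    q_y, q_x = q_idx // CANVAS_W, q_idx % CANVAS_W
    kv_y, kv_x = kv_idx // CANVAS_W, kv_idx % CANVAS_W
    # Clamp the window centre at the canvas edges so every query keeps a
    # full KERNEL_H x KERNEL_W neighbourhood, matching NATTEN's semantics.
    c_y = q_y.clamp(KERNEL_H // 2, (CANVAS_H - 1) - KERNEL_H // 2)
    c_x = q_x.clamp(KERNEL_W // 2, (CANVAS_W - 1) - KERNEL_W // 2)
    return ((c_y - kv_y).abs() <= KERNEL_H // 2) & ((c_x - kv_x).abs() <= KERNEL_W // 2)

device = "cuda" if torch.cuda.is_available() else "cpu"
q, k, v = (torch.randn(1, 1, SEQ, 64, device=device) for _ in range(3))

# FlexAttention path: compile the predicate into a block-sparse BlockMask.
block_mask = create_block_mask(
    natten_mask_mod, B=None, H=None, Q_LEN=SEQ, KV_LEN=SEQ, device=device
)
out_flex = flex_attention(q, k, v, block_mask=block_mask)

# Masked-SDPA path: materialize the same predicate densely (True = attend).
idx = torch.arange(SEQ, device=device)
dense_mask = natten_mask_mod(None, None, idx[:, None], idx[None, :])
out_sdpa = F.scaled_dot_product_attention(q, k, v, attn_mask=dense_mask)

torch.testing.assert_close(out_flex, out_sdpa, atol=1e-4, rtol=1e-4)
```

The clamp on the window centre is the detail that distinguishes NATTEN from plain sliding-window attention: edge queries keep a full 3x3 window rather than a truncated one, so every query attends to exactly KERNEL_H * KERNEL_W keys.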

Commit: … local attention it implements. Add NATTEN example to notebook and masks.
@facebook-github-bot added the CLA Signed label (managed by the Meta Open Source bot) on Aug 14, 2024
@drisspg (Contributor) left a comment:


Awesome! Thanks for contributing and fixing the previous example.

@drisspg merged commit 5e0d1b8 into pytorch-labs:main on Aug 14, 2024. 2 checks passed.