Issues: thu-ml/SageAttention


- Circular Import when using sageattn (#126, opened Mar 3, 2025 by MercuryB1)
- AMD GPU ROCm is supported (#125, opened Mar 2, 2025 by githust66)
- fluxpipeline speed 10%? (#121, opened Feb 27, 2025 by algorithmconquer)
- Unaligned LSE results (#120, opened Feb 25, 2025 by XinzeLi) [enhancement]
- Need hunyuanvideo support (#115, opened Feb 21, 2025 by zhiwei-dong)
- sliding window attention (#114, opened Feb 20, 2025 by matthijsvk) [enhancement]
- sageattention in hunyuanvideo meet attention_mask (#99, opened Feb 6, 2025 by Andy0422) [duplicate]
- Alibi or window Attention Support (#92, opened Jan 21, 2025 by asahni04)
- Compilation with full CUDA graphs (without breaks) (#74, opened Dec 20, 2024 by bm-synth) [enhancement]
- Possibilities of support Pascal (#66, opened Dec 7, 2024 by sorasoras) [enhancement, wontfix]
- SageAttention support for GPU T4? (#63, opened Dec 6, 2024 by ivankxt) [enhancement, wontfix]