
Ingest FP8 attn scales and use them in ROCm FlashAttention #338

Merged · 7 commits · Dec 20, 2024

Don't calculate KV scales dynamically if Q scale is included

0bd414a
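
The commit message describes a behavioral change: when a quantized checkpoint already provides a query (Q) scale along with its FP8 attention scales, the key/value scales should be taken from the checkpoint rather than recomputed dynamically at runtime. The sketch below is a minimal, hypothetical illustration of that decision logic; it does not reflect vLLM's actual API, and all names (`FP8AttnScales`, `resolve_kv_scales`, the abs-max parameters) are assumptions introduced for illustration.

```python
# Hypothetical sketch (not vLLM's real interface): reuse ingested FP8 KV scales
# when a Q scale is present, instead of deriving KV scales dynamically.
from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class FP8AttnScales:
    q_scale: Optional[float]  # None when the checkpoint ships no Q scale
    k_scale: float
    v_scale: float


def resolve_kv_scales(scales: FP8AttnScales,
                      k_absmax: float,
                      v_absmax: float,
                      fp8_max: float = 448.0) -> Tuple[float, float]:
    """Return the (k_scale, v_scale) to use for FP8 attention.

    If the checkpoint includes a Q scale, trust the ingested static KV scales;
    otherwise fall back to dynamic scales derived from observed abs-max values.
    """
    if scales.q_scale is not None:
        # Static path: the checkpoint supplied calibrated scales, use them as-is.
        return scales.k_scale, scales.v_scale
    # Dynamic fallback: map the observed value range onto the FP8 range.
    return k_absmax / fp8_max, v_absmax / fp8_max
```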