Commit
[pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
pre-commit-ci[bot] committed Aug 19, 2024
1 parent bfa1d53 commit 1ca1860
Showing 1 changed file with 4 additions and 1 deletion.
5 changes: 4 additions & 1 deletion transformer_engine/pytorch/attention.py
@@ -7201,7 +7201,10 @@ def forward(
         # Query, Key, and Value
         # ======================

-        fp8_mha = FP8GlobalStateManager.is_fp8_enabled() and FP8GlobalStateManager.get_fp8_recipe().fp8_mha
+        fp8_mha = (
+            FP8GlobalStateManager.is_fp8_enabled()
+            and FP8GlobalStateManager.get_fp8_recipe().fp8_mha
+        )

         if self.attention_type == "self":
             # Attention heads [sq, b, h] --> [sq, b, ng * (np/ng + 2) * hn]
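For context, the change itself is purely cosmetic: the pre-commit auto-fixer wraps a boolean assignment that overruns the configured line length in parentheses so each operand sits on its own line. Below is a minimal sketch of the same before/after pattern; the helper names are hypothetical stand-ins, since only the fp8_mha assignment from attention.py appears in this diff.

# Minimal sketch of the formatting pattern applied here (hypothetical helpers;
# only the fp8_mha assignment above is from the actual commit).

def _fp8_enabled() -> bool:
    # Stand-in for FP8GlobalStateManager.is_fp8_enabled()
    return True

class _Recipe:
    # Stand-in for the recipe object returned by get_fp8_recipe()
    fp8_mha = True

def _get_recipe() -> _Recipe:
    return _Recipe()

# Before: a single long line that trips the line-length check.
flag = _fp8_enabled() and _get_recipe().fp8_mha

# After: the same expression, wrapped in parentheses so the formatter can
# break it across lines without changing its value.
flag = (
    _fp8_enabled()
    and _get_recipe().fp8_mha
)

assert flag is True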
