fix v1 flash attn get_name
Signed-off-by: Isotr0py <[email protected]>
Isotr0py committed Nov 23, 2024
1 parent 2f9fad5 commit f6cd682
Showing 1 changed file with 1 addition and 1 deletion: vllm/v1/attention/backends/flash_attn.py
@@ -19,7 +19,7 @@ def get_supported_head_sizes() -> List[int]:
 
     @staticmethod
     def get_name() -> str:
-        return "flash-attn-vllm-v1"
+        return "FLASH_ATTN_VLLM_V1"
 
     @staticmethod
     def get_impl_cls() -> Type["FlashAttentionImpl"]:
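The fix changes the string returned by `get_name()` from lowercase-hyphenated to uppercase-underscored. A plausible reason (an assumption here, not stated in the commit) is that backends are selected by matching this name against an uppercase selector string, so a case mismatch would make the backend unresolvable. A minimal sketch of that kind of name-based lookup, with all registry names and helpers hypothetical rather than vLLM's actual code:

```python
# Hypothetical sketch: resolving an attention backend by its reported name.
# FlashAttentionBackend and resolve_backend are illustrative, not vLLM's API.

class FlashAttentionBackend:
    @staticmethod
    def get_name() -> str:
        # After the fix, the name is uppercase with underscores,
        # matching the convention assumed by the selector below.
        return "FLASH_ATTN_VLLM_V1"


def resolve_backend(selected: str, backends: list) -> type:
    # Case-sensitive match against each backend's reported name;
    # "flash-attn-vllm-v1" would never match "FLASH_ATTN_VLLM_V1".
    for backend in backends:
        if backend.get_name() == selected:
            return backend
    raise ValueError(f"no attention backend named {selected!r}")


backend = resolve_backend("FLASH_ATTN_VLLM_V1", [FlashAttentionBackend])
print(backend.get_name())
```

With the old lowercase name, the same lookup would raise `ValueError`, which is consistent with the commit describing the change as a fix.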
