
Commit

Update flashinfer.py
noamgat authored Jul 18, 2024
1 parent 96cc198 commit 1d7959c
Showing 1 changed file with 2 additions and 2 deletions.
4 changes: 2 additions & 2 deletions vllm/attention/backends/flashinfer.py
@@ -13,8 +13,8 @@
 import torch
 
 from vllm import _custom_ops as ops
-from vllm.attention.backends.abstract import (AttentionBackend, AttentionImpl,
-                                              AttentionMetadata, AttentionType)
+from vllm.attention.backends.abstract import (AttentionBackend,
+                                              AttentionImpl,
+                                              AttentionMetadata,
+                                              AttentionMetadataBuilder,
+                                              AttentionType)
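The change adds AttentionMetadataBuilder to the import and splits the parenthesized import to one name per line, a common PEP 8 pattern once a grouped import outgrows the line-length limit. A minimal sketch of the style, using stdlib names purely for illustration (the commit itself imports vLLM classes, which may not be installed):

```python
# One-name-per-line parenthesized imports: adding or removing a name
# later touches exactly one line of a future diff. Stdlib names are
# used here only to illustrate the formatting, not vLLM's API.
from collections import (OrderedDict,
                         defaultdict,
                         namedtuple)

# Quick check that the imported names are usable.
Point = namedtuple("Point", ["x", "y"])
print(Point(1, 2).x)
```

One-per-line imports also keep subsequent insertions sorted alphabetically with minimal churn, which is why the diff rewrites the whole group rather than appending to an existing line.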
