
Commit

Fix comment
Signed-off-by: mzusman <[email protected]>
mzusman committed Dec 10, 2024
1 parent c034ffe commit 75ff1ef
Showing 1 changed file with 1 addition and 2 deletions.
3 changes: 1 addition & 2 deletions vllm/config.py
@@ -694,8 +694,7 @@ def get_num_layers_by_block_type(
         block_type: LayerBlockType = LayerBlockType.attention,
     ) -> int:
         # This function relies on 'layers_block_type' in hf_config,
-        # for hybrid/attention-free models w/o this attribute,
-        # we will need to have workarounds like so
+        # for w/o this attribute, we will need to have workarounds like so
         attn_block_type = block_type == LayerBlockType.attention
         is_transformer = not self.is_hybrid and not self.is_attention_free
         start, end = self.get_layers_start_end_indices(parallel_config)
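
For context, the comment touched here describes counting layers by block type from the per-layer 'layers_block_type' attribute in hf_config, with a workaround for models that lack it. Below is a minimal sketch of that idea, not the vLLM implementation: LayerBlockType and the start/end layer indices follow the diff above, while count_layers_by_block_type, the mamba member, and the fallback logic are assumptions for illustration.

from enum import Enum
from types import SimpleNamespace


class LayerBlockType(str, Enum):
    attention = "attention"
    mamba = "mamba"


def count_layers_by_block_type(hf_config, start: int, end: int,
                               block_type: LayerBlockType) -> int:
    # 'layers_block_type' is a per-layer list of type strings when present
    # (e.g. hybrid attention/Mamba models); plain transformers omit it.
    layers_block_type = getattr(hf_config, "layers_block_type", None)
    if layers_block_type is None:
        # Assumed workaround for models without the attribute: treat every
        # layer in the range as an attention block.
        return (end - start) if block_type == LayerBlockType.attention else 0
    return sum(1 for t in layers_block_type[start:end] if t == block_type.value)


# Usage with a hypothetical 4-layer hybrid config, counting only layers 0-1.
cfg = SimpleNamespace(layers_block_type=["attention", "mamba", "attention", "mamba"])
print(count_layers_by_block_type(cfg, 0, 2, LayerBlockType.attention))  # prints 1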

