[Doc] update gpu-memory-utilization flag docs (vllm-project#9507)
Signed-off-by: Joe Runde <[email protected]>
Signed-off-by: Alvant <[email protected]>
joerunde authored and Alvant committed Oct 26, 2024
1 parent 8a565e9 commit 04376dc
1 changed file with 5 additions and 1 deletion: vllm/engine/arg_utils.py
@@ -428,7 +428,11 @@ def add_cli_args(parser: FlexibleArgumentParser) -> FlexibleArgumentParser:
             help='The fraction of GPU memory to be used for the model '
             'executor, which can range from 0 to 1. For example, a value of '
             '0.5 would imply 50%% GPU memory utilization. If unspecified, '
-            'will use the default value of 0.9.')
+            'will use the default value of 0.9. This is a global gpu memory '
+            'utilization limit, for example if 50%% of the gpu memory is '
+            'already used before vLLM starts and --gpu-memory-utilization is '
+            'set to 0.9, then only 40%% of the gpu memory will be allocated '
+            'to the model executor.')
         parser.add_argument(
             '--num-gpu-blocks-override',
             type=int,
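For context, the arithmetic described in the new help text can be sketched as below. This is only an illustration of the documented behavior, not vLLM internals; the 40 GiB GPU size and the 50% pre-existing usage are hypothetical values chosen to match the example in the help string.

# Illustrative sketch (not vLLM code): how a global --gpu-memory-utilization
# cap interacts with memory that is already in use before vLLM starts.

total_gpu_mem_gib = 40.0          # hypothetical GPU size
already_used_fraction = 0.50      # fraction of GPU memory in use before vLLM starts
gpu_memory_utilization = 0.90     # value passed via --gpu-memory-utilization

# The flag caps *total* GPU memory utilization, so the model executor only
# gets the headroom between the cap and what is already allocated.
executor_fraction = max(gpu_memory_utilization - already_used_fraction, 0.0)
executor_mem_gib = executor_fraction * total_gpu_mem_gib

print(f"Model executor may allocate {executor_fraction:.0%} "
      f"of GPU memory (~{executor_mem_gib:.1f} GiB)")
# -> Model executor may allocate 40% of GPU memory (~16.0 GiB)

In practice the limit is supplied on the command line when launching the engine, e.g. --gpu-memory-utilization 0.9, as shown in the argument definition above.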
