Commit
Sampling: Add max_completion_tokens
Conforms with OpenAI's updated spec, which prefers max_completion_tokens over the older max_tokens parameter.

Signed-off-by: kingbri <[email protected]>
bdashore3 committed Dec 13, 2024
Parent: bc3c154 · Commit: c23e406
Showing 1 changed file with 3 additions and 1 deletion.
common/sampling.py (3 additions, 1 deletion)

@@ -25,7 +25,9 @@ class BaseSamplerRequest(BaseModel):

     max_tokens: Optional[int] = Field(
         default_factory=lambda: get_default_sampler_value("max_tokens"),
-        validation_alias=AliasChoices("max_tokens", "max_length"),
+        validation_alias=AliasChoices(
+            "max_tokens", "max_completion_tokens", "max_length"
+        ),
         description="Aliases: max_length",
         examples=[150],
         ge=0,
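For context, here is a minimal, self-contained sketch of what the change does. AliasChoices and model_validate are real Pydantic v2 APIs; the SketchRequest model, default value, and sample payloads below are hypothetical and not part of the repository. It only illustrates that a request sending max_completion_tokens, max_tokens, or the legacy max_length now populates the same field.

# Hypothetical sketch -- not the project's actual model.
# Shows how a validation_alias built from AliasChoices lets any of the
# listed request keys populate the single max_tokens field.
from typing import Optional

from pydantic import AliasChoices, BaseModel, Field


class SketchRequest(BaseModel):
    max_tokens: Optional[int] = Field(
        default=None,
        validation_alias=AliasChoices(
            "max_tokens", "max_completion_tokens", "max_length"
        ),
        ge=0,
    )


# All three request shapes resolve to the same field value (150):
print(SketchRequest.model_validate({"max_tokens": 150}).max_tokens)
print(SketchRequest.model_validate({"max_completion_tokens": 150}).max_tokens)
print(SketchRequest.model_validate({"max_length": 150}).max_tokens)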
