[Bugfix] Set temperature=0.7 in test_guided_choice_chat (vllm-project…
mgoin authored Dec 18, 2024
1 parent 2d1b9ba commit c77eb8a
Showing 1 changed file with 2 additions and 0 deletions.
2 changes: 2 additions & 0 deletions tests/entrypoints/openai/test_chat.py
@@ -482,6 +482,7 @@ async def test_guided_choice_chat(client: openai.AsyncOpenAI,
         model=MODEL_NAME,
         messages=messages,
         max_completion_tokens=10,
+        temperature=0.7,
         extra_body=dict(guided_choice=sample_guided_choice,
                         guided_decoding_backend=guided_decoding_backend))
     choice1 = chat_completion.choices[0].message.content
@@ -496,6 +497,7 @@ async def test_guided_choice_chat(client: openai.AsyncOpenAI,
         model=MODEL_NAME,
         messages=messages,
         max_completion_tokens=10,
+        temperature=0.7,
         extra_body=dict(guided_choice=sample_guided_choice,
                         guided_decoding_backend=guided_decoding_backend))
     choice2 = chat_completion.choices[0].message.content
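
For context, both hunks touch the same request shape: a chat completion with a guided_choice constraint, now sent with an explicit temperature=0.7 instead of whatever the client default is. Below is a minimal sketch (not the vLLM test itself) of how such a guided-choice request with an explicit temperature can be issued against a vLLM OpenAI-compatible server; the base_url, model name, and choice list are illustrative assumptions rather than values from the test.

    # Minimal sketch of a guided-choice chat request against a vLLM
    # OpenAI-compatible server. base_url, api_key, model, and the choice
    # list are assumed/illustrative values.
    import asyncio
    import openai

    async def main():
        client = openai.AsyncOpenAI(base_url="http://localhost:8000/v1",
                                    api_key="EMPTY")
        sample_guided_choice = ["positive", "negative", "neutral"]
        messages = [{"role": "user",
                     "content": "Classify the sentiment of: 'Great movie!'"}]

        chat_completion = await client.chat.completions.create(
            model="my-served-model",       # placeholder for the model vLLM serves
            messages=messages,
            max_completion_tokens=10,
            temperature=0.7,               # explicit, as in the commit
            # extra_body forwards vLLM-specific fields the OpenAI SDK
            # does not model directly.
            extra_body=dict(guided_choice=sample_guided_choice))
        print(chat_completion.choices[0].message.content)

    asyncio.run(main())

With guided_choice set, the server constrains decoding so the returned content is one of the listed strings; the explicit temperature simply controls how that constrained output is sampled.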
