Fix: Add missing enable_prompt_tokens_details flag for vllm (#786)
dleviminzi authored Dec 16, 2024
1 parent aab7c62 · commit 835f131
Showing 2 changed files with 2 additions and 1 deletion.
sdk/pyproject.toml (1 addition, 1 deletion)

@@ -1,6 +1,6 @@
 [tool.poetry]
 name = "beta9"
-version = "0.1.134"
+version = "0.1.135"
 description = ""
 authors = ["beam.cloud <[email protected]>"]
 packages = [
sdk/src/beta9/abstractions/integrations/vllm.py (1 addition, 0 deletions)

@@ -253,6 +253,7 @@ def __init__(
             enable_auto_tool_choice=vllm_args.enable_auto_tool_choice,
             tool_call_parser=vllm_args.tool_call_parser,
             disable_log_stats=vllm_args.disable_log_stats,
+            enable_prompt_tokens_details=vllm_args.enable_prompt_tokens_details,
         )

     def __name__(self) -> str:
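
The one-line change forwards a flag that the SDK's vllm_args object already carries into the server configuration built in __init__, so responses served through the integration can include prompt token usage details (what vLLM's enable_prompt_tokens_details option controls in its OpenAI-compatible server). A minimal usage sketch follows; the VLLM and VLLMArgs names, the import path, and the constructor shape are assumptions inferred from the changed file, not confirmed by this diff.

# Minimal sketch, assuming the beta9 SDK exposes VLLM and VLLMArgs from the
# module changed in this commit; names, import path, and constructor shape
# are inferred from sdk/src/beta9/abstractions/integrations/vllm.py.
from beta9.abstractions.integrations.vllm import VLLM, VLLMArgs

vllm_server = VLLM(
    vllm_args=VLLMArgs(
        model="meta-llama/Llama-3.1-8B-Instruct",  # hypothetical model choice
        # As of this commit, the flag below is actually forwarded to the
        # vLLM server rather than being accepted and then dropped.
        enable_prompt_tokens_details=True,
    ),
)

Before the 0.1.135 release cut by this commit, setting the field had no effect because it was never passed through to the server arguments.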
