fix: default proxyllm generator function (eosphoros-ai#971)
xtyuns authored and Hopshine committed Sep 10, 2024
1 parent bd47ed2 commit e2a5a81
Showing 1 changed file with 1 addition and 1 deletion.
dbgpt/model/llm_out/proxy_llm.py (1 addition & 1 deletion)
@@ -34,7 +34,7 @@ def proxyllm_generate_stream(
     model_name = model_params.model_name
     default_error_message = f"{model_name} LLM is not supported"
     generator_function = generator_mapping.get(
-        model_name, lambda: default_error_message
+        model_name, lambda *args: [default_error_message]
     )

     yield from generator_function(model, tokenizer, params, device, context_len)
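
The one-line change matters because the fallback is invoked with the same five positional arguments as any registered generator, and its result is consumed by `yield from`. Below is a minimal sketch of why the old default breaks; the surrounding names and argument values are illustrative stand-ins, not dbgpt's actual objects, and `generator_mapping` is assumed empty (i.e. the requested model has no registered generator).

# Old default: a zero-argument lambda. Calling it with five positional
# arguments raises TypeError; even if it were callable, `yield from`
# over the returned string would emit one character at a time instead
# of the whole message.
old_default = lambda: default_error_message  # noqa: E731

# New default: accepts any arguments and returns a one-element list,
# so `yield from` yields the complete error message as a single chunk.
new_default = lambda *args: [default_error_message]  # noqa: E731

generator_mapping = {}
model_name = "unknown-model"
default_error_message = f"{model_name} LLM is not supported"

def stream(default):
    generator_function = generator_mapping.get(model_name, default)
    # Mirrors the call in proxyllm_generate_stream; the arguments here
    # are placeholders, not real model objects.
    yield from generator_function("model", "tokenizer", {}, "cpu", 4096)

print(list(stream(new_default)))  # ['unknown-model LLM is not supported']
# list(stream(old_default))  # TypeError: <lambda>() takes 0 positional arguments but 5 were given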
