
Commit

Merge pull request #1 from lvhan028/PR-2888
Fix lint error
Galaxy-Husky authored Dec 12, 2024
2 parents fdb13da + 71037a2 commit 5ac3cfb
Showing 1 changed file with 12 additions and 9 deletions.
lmdeploy/serve/async_engine.py — 21 changes: 12 additions & 9 deletions
@@ -223,9 +223,10 @@ def __call__(self,
         """Inference a batch of prompts.
         Args:
-            prompts (List[str] | str | List[Dict] | List[List[Dict]]]): a batch of
-                prompts. It accepts: string prompt, a list of string prompts,
-                a chat history in OpenAI format or a list of chat history.
+            prompts (List[str] | str | List[Dict] | List[List[Dict]]]): a
+                batch of prompts. It accepts: string prompt, a list of string
+                prompts, a chat history in OpenAI format or a list of chat
+                history.
             gen_config (GenerationConfig | None): a instance of
                 GenerationConfig. Default to None.
             do_preprocess (bool): whether pre-process the messages. Default to
@@ -297,9 +298,10 @@ def batch_infer(self,
         """Inference a batch of prompts.
         Args:
-            prompts (List[str] | str | List[Dict] | List[List[Dict]]]): a batch of
-                prompts. It accepts: string prompt, a list of string prompts,
-                a chat history in OpenAI format or a list of chat history.
+            prompts (List[str] | str | List[Dict] | List[List[Dict]]]): a
+                batch of prompts. It accepts: string prompt, a list of string
+                prompts, a chat history in OpenAI format or a list of chat
+                history.
             gen_config (GenerationConfig | None): a instance of or a list of
                 GenerationConfig. Default to None.
             do_preprocess (bool): whether pre-process the messages. Default to
@@ -374,9 +376,10 @@ def stream_infer(
         """Inference a batch of prompts with stream mode.
         Args:
-            prompts (List[str] | str | List[Dict] | List[List[Dict]]]): a batch of
-                prompts. It accepts: string prompt, a list of string prompts,
-                a chat history in OpenAI format or a list of chat history.
+            prompts (List[str] | str | List[Dict] | List[List[Dict]]]):a
+                batch of prompts. It accepts: string prompt, a list of string
+                prompts, a chat history in OpenAI format or a list of chat
+                history.
             gen_config (GenerationConfig | None): a instance of or a list of
                 GenerationConfig. Default to None.
             do_preprocess (bool): whether pre-process the messages. Default to
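The reflowed docstrings above enumerate the four shapes the `prompts` argument accepts. A minimal sketch of those shapes in plain Python (illustrative data only, no lmdeploy calls; the variable names are made up for this example):

```python
# The four prompt shapes described in the docstrings of __call__,
# batch_infer and stream_infer:

# 1. A single string prompt.
single_prompt = "What is the capital of France?"

# 2. A list of string prompts (a batch, List[str]).
batch_prompts = ["Hi there", "Summarize this text"]

# 3. A single chat history in OpenAI message format (List[Dict]).
chat_history = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
]

# 4. A list of chat histories (List[List[Dict]]), one per request.
batch_histories = [
    [{"role": "user", "content": "First conversation"}],
    [{"role": "user", "content": "Second conversation"}],
]
```

Any of these values would be a valid `prompts` argument per the docstring; the engine infers the batch size from the outer structure.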
