Support tools and tool_choice parameter in OpenAI compatible service #1869
Comments
I am confused: is this supported by vLLM or not?

+1, need the function-calling feature.

How should the tools and tool_choice parameters be used with the vLLM OpenAI-compatible API? (translated from Chinese)

Can someone help confirm whether tools/tool_choice is supported? It is not clear from the thread. I am using the Mistral 8x7B Instruct model, which supports function calling.

Not yet; tools support is coming with #3237.

I really appreciate all the work done in the attempts so far (#2488, #3237, #4656)! I've been waiting for months now... I'd like to suggest a three-step approach to adding this feature to vLLM. It's a relatively big change, and I think things would be much easier if we went step by step. Step 1 – support

@br3no Yes. Thank you for your suggestion and for pushing this through.
Also aliased as functions and function_call in the deprecated parameters: https://platform.openai.com/docs/api-reference/chat/create#chat-create-tools
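For reference, the OpenAI Chat Completions API expects tools and tool_choice shaped roughly as below. This is a minimal sketch; the function name (get_weather) and its schema are made up for illustration:

```python
# Sketch of the OpenAI-style `tools` / `tool_choice` request parameters.
# The function name and schema below are illustrative, not part of vLLM.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "City name"},
                },
                "required": ["city"],
            },
        },
    }
]

# tool_choice may be "auto", "none", or a specific function by name:
tool_choice = {"type": "function", "function": {"name": "get_weather"}}
```

The deprecated functions/function_call aliases carry the same information, just without the "type": "function" wrapper.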
After #1756 is merged (thanks @Tostino!), it should be straightforward to add this as a core parameter to the OpenAI-compatible service. This will help unlock client libraries that use a similar interface. Do note that the underlying model needs to support function calling (e.g. OpenHermes), and prompt engineering might be needed.
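Once supported server-side, a client call might look like the sketch below. Everything here is an assumption for illustration: the server URL, model name, and tool schema are hypothetical, and the request only makes sense against a model that actually supports function calling.

```python
# Hypothetical sketch: the server URL, model name, and tool schema below
# are assumptions, not the actual vLLM implementation.
#
# from openai import OpenAI
# client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

def tool_request_kwargs(question: str) -> dict:
    """Assemble kwargs for a tool-enabled chat.completions.create(...) call."""
    return {
        "model": "teknium/OpenHermes-2.5-Mistral-7B",  # assumed model name
        "messages": [{"role": "user", "content": question}],
        "tools": [
            {
                "type": "function",
                "function": {
                    "name": "get_weather",  # illustrative function
                    "description": "Get the current weather for a city.",
                    "parameters": {
                        "type": "object",
                        "properties": {"city": {"type": "string"}},
                        "required": ["city"],
                    },
                },
            }
        ],
        "tool_choice": "auto",
    }

# With a running server, the call would be something like:
# response = client.chat.completions.create(
#     **tool_request_kwargs("What is the weather in Paris?"))
# tool_call = response.choices[0].message.tool_calls[0]
```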
Also see @dongxiaolong's example here: #1756 (comment)