Function calling vllm #8
Conversation
maxDavid40 commented Jul 18, 2024
- Ensure vllm's function calling compatibility with happy_vllm.
- Add argparse arguments to facilitate API deployment with function calling.
- Create a FunctionTool class to support future function-calling functions, with two examples (Weather and Music).
- Add a Depends to modify requests before POSTing to chat/completion.
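A minimal sketch of what such a FunctionTool base class and a Weather example could look like; the class name comes from this PR description, but the method names, fields, and JSON shapes below are assumptions, not the actual happy_vllm implementation.

```python
from abc import ABC, abstractmethod


class FunctionTool(ABC):
    """Hypothetical base class for function-calling tools."""

    name: str
    description: str

    @abstractmethod
    def parameters(self) -> dict:
        """Return the JSON schema of the tool's arguments."""

    def to_openai(self) -> dict:
        # Shape expected by an OpenAI-compatible `tools` field.
        return {
            "type": "function",
            "function": {
                "name": self.name,
                "description": self.description,
                "parameters": self.parameters(),
            },
        }


class Weather(FunctionTool):
    """Example tool: look up the current weather for a city."""

    name = "get_current_weather"
    description = "Get the current weather for a city"

    def parameters(self) -> dict:
        return {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        }
```

Each subclass then only has to declare its name, description, and argument schema; the shared `to_openai` method produces the spec to pass in the request's `tools` list.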
Add to the README file some documentation about the function calling functionality added.
src/happy_vllm/core/resources.py (outdated)

```python
# Load the tools
update_tools(args=args)
```
- Should we integrate the tools arguments server side? Should we not keep them API side, as is done in the roadmap of the OpenAI APIs?
- If an additional tool is needed, the server needs to be relaunched.
- If tools are defined server side, tool usage conflicts can appear in production during inference when several requests are launched that need different tools (a conflict on tool naming, for instance).
Sure, the implementation doesn't match happy_vllm's design, so:
- Move it to the examples directory to keep the implementation, since it could be useful for someone who needs a local deployment.
- We could create a new route to update the tool_choice and tools variables (to be avoided in production).
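A sketch of what the handler behind such an update route could call; the registry globals and the payload shape are assumptions drawn from this discussion, not happy_vllm's actual API.

```python
# Hypothetical in-memory tool registry; the TOOLS_DICT / TOOLS names
# come from this PR's discussion, the structure is assumed.
TOOLS_DICT: dict = {}  # tool name -> OpenAI-style tool spec
TOOLS: list = []       # names of currently enabled tools


def update_tools(tools: list) -> list:
    """Replace the enabled tools at runtime (what a hypothetical
    update route could call), avoiding a server relaunch when the
    set of tools changes."""
    TOOLS_DICT.clear()
    TOOLS.clear()
    for spec in tools:
        name = spec["function"]["name"]
        if name in TOOLS_DICT:
            # Guard against the tool-naming conflicts mentioned above.
            raise ValueError(f"conflicting tool name: {name}")
        TOOLS_DICT[name] = spec
        TOOLS.append(name)
    return TOOLS
```

Swapping the whole registry per call keeps the server-side state consistent, but it does not remove the concurrency concern raised above: two simultaneous requests updating the registry would still race.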
```python
else:
    data_dict['tools'] = tools["tools"]
if data_dict['top_logprobs'] == 0:
```
Why is it necessary to reassign a None value to the "top_logprobs" variable?
ChatCompletionRequest has optional top_logprobs (int) and logprobs (bool) fields, and a pydantic validator that checks that if top_logprobs is set then logprobs must be True.
When I get the ChatCompletionRequest data instantiated with top_logprobs=None and logprobs=None, we end up with top_logprobs=0 and logprobs=False, so we have to rewrite top_logprobs to None to avoid a validator error when the new ChatCompletionRequest is instantiated.
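A plain-Python sketch of the round-trip problem described above; the field names come from this thread, but the defaulting and validation behavior are mimicked here, not taken from vllm's actual pydantic model.

```python
def make_request(top_logprobs=None, logprobs=None):
    """Mimic the defaulting described above: None inputs come back
    as 0 / False after the first instantiation."""
    return {
        "top_logprobs": 0 if top_logprobs is None else top_logprobs,
        "logprobs": bool(logprobs),
    }


def validate(data):
    """Mimic the validator: a set top_logprobs requires logprobs=True."""
    if data["top_logprobs"] is not None and not data["logprobs"]:
        raise ValueError("top_logprobs requires logprobs=True")
    return data


data_dict = make_request()  # {"top_logprobs": 0, "logprobs": False}

# Re-validating the round-tripped dict as-is would raise, because the
# defaulted 0 now looks like a value the caller explicitly set.
# Rewriting it back to None makes the second instantiation pass:
if data_dict["top_logprobs"] == 0:
    data_dict["top_logprobs"] = None
validate(data_dict)
```

This is why the `if data_dict['top_logprobs'] == 0` branch in the diff resets the field before building the new request.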
examples/function_tools/README.md (outdated)

## To know

With vllm 0.5.0 to 0.5.3.post1, tool_choice's options ```auto``` and ```required``` are not yet implemented. You can only use one function per deployment:
Is it one function per deployment or one function per call?
One function per deployment.
```python
TOOLS = ['weather']
```

## Called functions
Should we specify that this endpoint has to be called only if the happy_vllm server is deployed with predefined tools?
Each attribute corresponds to the [Openai api](https://platform.openai.com/docs/api-reference/chat/create#chat-create-tools)

After the class is created, you have to declare it in the TOOLS_DICT and TOOLS global variables in ```routers.shcema.functional``` (add the weather function tool, for example).
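A minimal sketch of that declaration step; the TOOLS_DICT / TOOLS names come from the README excerpt above, while the WeatherTool stub stands in for the real class and is purely illustrative.

```python
# Stand-in for the actual Weather tool class shipped with the examples.
class WeatherTool:
    name = "weather"


# Hypothetical globals in routers.schema.functional:
TOOLS_DICT = {"weather": WeatherTool()}  # tool name -> tool instance
TOOLS = ["weather"]                      # tools enabled for this deployment
```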
Typo to correct: ```routers.schema.functional```