Your current environment
How would you like to use vllm
I want to run inference with a Mistral (Mixtral) 8x7B model, and I want to use function calling / tools while inferencing through the OpenAI-compatible API endpoints, but I haven't been able to find details on this. Is it supported? If not, what is the timeline for adding it?
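For reference, this is a sketch of the kind of request I'd like to send to the OpenAI-compatible chat completions endpoint. The model name, tool definition, and server URL are placeholders for my deployment; I'm assuming the standard OpenAI `tools` / `tool_choice` request schema would be accepted if support exists.

```python
import json

# Hypothetical request body following the OpenAI chat completions schema.
# Whether vLLM's server honors the "tools" fields is exactly my question.
payload = {
    "model": "mistralai/Mixtral-8x7B-Instruct-v0.1",
    "messages": [
        {"role": "user", "content": "What is the weather in Paris?"}
    ],
    # OpenAI-style function/tool definition
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "Get the current weather for a city",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "city": {"type": "string"}
                    },
                    "required": ["city"],
                },
            },
        }
    ],
    "tool_choice": "auto",
}

# This would be POSTed to something like
# http://localhost:8000/v1/chat/completions on the vLLM server.
print(json.dumps(payload, indent=2))
```

The expectation would be a response whose message contains `tool_calls` with the function name and JSON arguments, as in the OpenAI API.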