
[Feature]: support Mistral-Large-Instruct-2407 function calling #6778

Open
ybdesire opened this issue Jul 25, 2024 · 5 comments

Comments

@ybdesire
🚀 The feature, motivation and pitch

  1. Mistral-Large-Instruct-2407 performance is great, including Tool Use & Function Calling: https://mistral.ai/news/mistral-large-2407/
  2. The feature requirement is (maybe) similar to: [Feature]: VLLM suport for function calling in Mistral-7B-Instruct-v0.3 #5156

Alternatives

No response

Additional context

No response

@BUJIDAOVS

Same request! I really need vLLM to support a function call API. In fact, many of the currently released LLMs already have function calling capabilities, which is very important for downstream application development. Please consider it!

@K-Mistele
Contributor

If it uses the same function call format as Mistral 7B v0.3, it should be supported by #5649.
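For context, here is a minimal sketch of what parsing that Mistral v0.3-style tool-call output might look like. It assumes the model emits a `[TOOL_CALLS]` token followed by a JSON array of calls (the tool name `get_weather` is purely illustrative; verify the exact format against the model's chat template):

```python
import json

# Assumed special token marking tool calls in Mistral v0.3-style output.
TOOL_CALLS_TOKEN = "[TOOL_CALLS]"

def parse_tool_calls(model_output: str):
    """Split model output into plain text and a list of tool calls.

    Assumes the payload after the token is a JSON array like
    [{"name": "get_weather", "arguments": {"city": "Paris"}}].
    """
    if TOOL_CALLS_TOKEN not in model_output:
        return model_output, []
    text, _, payload = model_output.partition(TOOL_CALLS_TOKEN)
    return text.strip(), json.loads(payload.strip())

# Example with a hypothetical get_weather tool:
text, calls = parse_tool_calls(
    '[TOOL_CALLS] [{"name": "get_weather", "arguments": {"city": "Paris"}}]'
)
```

This mirrors the general idea behind vLLM's tool-call parsers: detect the marker token and decode the structured payload, returning any surrounding text separately.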

@Sapessii

Sapessii commented Aug 8, 2024

@K-Mistele first of all, thank you and good job on what you're doing to implement function calling on vLLM.

I'm using a quantized version of Mistral Large 2407. I don't know if the template is the same, but you can check it on line 6176 here: https://huggingface.co/ModelCloud/Mistral-Large-Instruct-2407-gptq-4bit/blob/main/tokenizer_config.json


github-actions bot commented Nov 8, 2024

This issue has been automatically marked as stale because it has not had any activity within 90 days. It will be automatically closed if no further activity occurs within 30 days. Leave a comment if you feel this issue should remain open. Thank you!

@github-actions github-actions bot added the stale label Nov 8, 2024
@flefevre

flefevre commented Nov 20, 2024

MistralAI highlights the compatibility between vLLM and Mistral-Small (https://huggingface.co/mistralai/Mistral-Small-Instruct-2409), but sadly there are several tickets noting that vLLM does not implement "function calling" (aka tools) for Mistral-Small.
Could you confirm the status of function calling support for Mistral models such as Small and Large?
And could you share the command to start vLLM with tools enabled?
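For reference, a typical invocation looks like the following. This is a sketch based on recent vLLM versions: the flag names (`--enable-auto-tool-choice`, `--tool-call-parser`, `--tokenizer-mode`) come from vLLM's OpenAI-compatible server tool-calling support and may change between releases, so verify them with `vllm serve --help` for your installed version:

```shell
# Launch the OpenAI-compatible server with automatic tool choice enabled.
# --tool-call-parser mistral selects the parser for Mistral's [TOOL_CALLS] format;
# --tokenizer-mode mistral uses the official mistral-common tokenizer.
vllm serve mistralai/Mistral-Small-Instruct-2409 \
    --enable-auto-tool-choice \
    --tool-call-parser mistral \
    --tokenizer-mode mistral
```

Clients can then pass `tools` and `tool_choice` in standard OpenAI-style chat completion requests against the served endpoint.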

@github-actions github-actions bot added unstale and removed stale labels Nov 22, 2024