I want to add mamba_chat (2.8b) model #2001
+1 for this. It would be big to have vLLM support for Mamba-type models. I expect even larger Mamba models will be out soon.
FYI, Jamba is now supported in vLLM (#4115), so it should be easier to implement this model now.
+1 on Mamba models. Mistral just dropped a Mamba-based code model that looks awesome.
I think this should be solved by #6484.
Mamba support has just landed (Mamba2 support is still TODO, so Codestral Mamba is not yet supported).
Hello vLLM team,
havenhq/mamba-chat does not seem to be supported by vLLM. Could you help me add it?
Sincerely,