
Failed to load model: Model type mllama not supported #208

Open
chronick opened this issue Nov 18, 2024 · 2 comments
@chronick

chronick commented Nov 18, 2024

Attempting to load the vision model https://huggingface.co/mlx-community/Llama-3.2-11B-Vision-Instruct-8bit (mllama, MLX architecture).

Is this meant to be supported on my machine/version? I can't seem to find any docs about it.

🥲 Failed to load the model

Failed to load model

Error when loading model: ValueError: Model type mllama not supported.

macOS 15.1, M4 Pro
LM Studio 0.3.5
Runtime: LM Studio MLX 0.0.14

I also have the Metal llama.cpp runtime 1.2.0 installed, but I don't believe it is being used here.
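For context, this kind of error is typically raised when the runtime reads the `model_type` field from the repo's `config.json` and finds no matching implementation. A minimal sketch of that gating logic (the supported set and the `check_model_type` helper are illustrative, not mlx-engine's actual API):

```python
# Hypothetical sketch of how a loader rejects unknown architectures.
# SUPPORTED_TYPES is an illustrative subset, not mlx-engine's real list.
SUPPORTED_TYPES = {"llama", "mistral", "qwen2"}

def check_model_type(config: dict) -> str:
    """Return the model type from a parsed config.json, or raise."""
    model_type = config.get("model_type")
    if model_type not in SUPPORTED_TYPES:
        raise ValueError(f"Model type {model_type} not supported.")
    return model_type

# The Llama-3.2-11B-Vision repo declares "mllama" in its config.json,
# so a loader without an mllama implementation fails before loading weights.
try:
    check_model_type({"model_type": "mllama"})
except ValueError as e:
    print(e)  # Model type mllama not supported.
```

This is why the failure happens immediately rather than partway through loading: the architecture check runs on the config before any weights are touched.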

@YorkieDev

YorkieDev commented Nov 21, 2024

@chronick this model is not yet supported in the current stable version of LM Studio.
See: lmstudio-ai/mlx-engine#5 (comment)

nb: It's also not supported in llama.cpp, so GGUFs of it won't load either.

@skovvuri-nxt

@YorkieDev, thanks for the response. Is there a plan to support this model, or similar ones such as deepseek-vl2, mold, or SmolVLM?


3 participants