Make Llava configurable so that you can swap the text model #8191
mergennachin started this conversation in Ideas
🚀 The feature, motivation and pitch
The current Llava example uses Llama 2 7B as the pretrained text model: https://github.com/pytorch/executorch/blob/main/examples/models/llava/README.md
The latest quantized Llama 1B/3B models offer good accuracy at a much smaller size. Let's make the text model swappable so these can be used instead.
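One way to structure this, sketched below with hypothetical names (the real ExecuTorch Llava model and its interfaces differ): pass a text-model factory through a config object, so Llava is constructed against an interface rather than a hard-coded Llama 2 7B backbone.

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical minimal text-model stand-in; a real backbone would wrap
# an actual Llama checkpoint and expose forward/embedding methods.
class TextModel:
    def __init__(self, name: str, embed_dim: int):
        self.name = name
        self.embed_dim = embed_dim

@dataclass
class LlavaConfig:
    # Factory, not an instance, so Llava controls when/where the
    # backbone is built (e.g. after quantization settings are known).
    text_model_factory: Callable[[], TextModel]

class Llava:
    def __init__(self, config: LlavaConfig):
        # Llava never names a specific Llama variant; it only relies
        # on the TextModel interface.
        self.text_model = config.text_model_factory()

# Swapping backbones is then a one-line change at the call site:
llava_1b = Llava(LlavaConfig(lambda: TextModel("llama-3.2-1b", 2048)))
llava_7b = Llava(LlavaConfig(lambda: TextModel("llama-2-7b", 4096)))
print(llava_1b.text_model.name)  # llama-3.2-1b
```

The dimension mismatch between backbones (e.g. the projector from image embeddings into the text embedding space) would also need to be derived from the config rather than hard-coded.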
Alternatives
No response
Additional context
No response
RFC (Optional)
No response