- This should already be supported on remote backends using the Mistral prompt format. Is there a specific llama.cpp version that added support for this model? I would assume it already works, since Mistral models are generally the same architecture as Meta's Llama models.
- It looks like the extension does not currently support this model; I keep getting the error "Unexpected error during intent recognition".
  I've tested this model out, and it makes sense to support it, especially because the context window is so large. In my testing it's also a lot snappier than command-r.