Perplexity: mixtral-8x7b-instruct #9994
SilasFriby started this conversation in General
Replies: 1 comment
-
For now I have just made a custom class including the model.
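That custom-class workaround could look something like the sketch below. It bypasses any fixed model allow-list by calling Perplexity's OpenAI-compatible `/chat/completions` endpoint directly; `CustomPerplexityLLM` and its methods are hypothetical names for illustration, not the actual LlamaIndex class:

```python
# Minimal sketch of a custom wrapper, assuming Perplexity's
# OpenAI-compatible /chat/completions endpoint. This is not the
# official LlamaIndex Perplexity class.
import json
import urllib.request


class CustomPerplexityLLM:
    """Hypothetical stand-in that accepts any model name the API supports."""

    API_URL = "https://api.perplexity.ai/chat/completions"

    def __init__(self, api_key: str, model: str = "mixtral-8x7b-instruct"):
        self.api_key = api_key
        self.model = model  # no client-side allow-list, so new models work

    def build_payload(self, prompt: str) -> dict:
        # Payload construction is kept separate so it can be inspected
        # without making a network call.
        return {
            "model": self.model,
            "messages": [{"role": "user", "content": prompt}],
        }

    def complete(self, prompt: str) -> str:
        # Sends the request and returns the first choice's message text.
        req = urllib.request.Request(
            self.API_URL,
            data=json.dumps(self.build_payload(prompt)).encode(),
            headers={
                "Authorization": f"Bearer {self.api_key}",
                "Content-Type": "application/json",
            },
        )
        with urllib.request.urlopen(req) as resp:
            body = json.load(resp)
        return body["choices"][0]["message"]["content"]
```

A proper fix in LlamaIndex itself would presumably just add the model name (and its context window) to the Perplexity class's supported-model table.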
-
Dear LlamaIndex team,
would it be possible for you to extend the Perplexity LLM class so that it also supports the model mixtral-8x7b-instruct?
Best regards
Silas