No options to use it with Open Sourced models? #272
venturaEffect started this conversation in Ideas
The project is awesome, but I think it falls short by not providing an option to use open-source LLMs. Implementing that would be massive. Then again, maybe I'm wrong and have simply missed that it is already possible. Anyway, congrats on the project!

Replies: 1 comment

-
It should be possible to run your own LLM backend and use any OpenAI-compatible endpoint. You would probably need to tweak mindsearch/agent/models.py and set your backend URL in backend.py. Try it out!
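As a rough illustration of the suggestion above, here is a minimal sketch (not MindSearch's actual configuration) for checking that a self-hosted, OpenAI-compatible server responds correctly before pointing mindsearch/agent/models.py and backend.py at it. The base URL http://localhost:23333/v1, the model name, and the placeholder API key are assumptions; substitute whatever your own backend (for example an LMDeploy or vLLM server) actually exposes.

```python
# Minimal sanity check for a self-hosted, OpenAI-compatible endpoint.
# Assumptions: the server is reachable at http://localhost:23333/v1 and
# serves a model named "internlm2_5-7b-chat"; both are placeholders.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:23333/v1",  # your self-hosted endpoint
    api_key="EMPTY",                       # many local servers ignore the key
)

response = client.chat.completions.create(
    model="internlm2_5-7b-chat",
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response.choices[0].message.content)
```

If this round-trip works, the same base URL and model name are what you would plug into the model configuration used by MindSearch.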