How to connect an LLM that is not on the supported list #856
Hello @ferdyhong. Thank you for reaching out to us, and apologies for the delayed reply. I would like to understand the problem better. When you say you have a private LLM, do you mean an in-house-built LLM, or is it something publicly available like OpenAI or Gemini?
Hi, @gaya3-zipstack
Oh, I see. So does this mean that only the connection/configuration parameters are different?
Please check if this is useful and can be applied to your use case. In particular, please check out the section "Things to keep in mind".
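For context, if your private LLM turns out to expose an OpenAI-compatible API, the extra parameters usually reduce to a custom base URL plus additional request headers. The sketch below uses the generic openai Python SDK, not Unstract's adapter configuration; the endpoint URL, header name, and model name are placeholders, not values taken from this issue:

```python
# Minimal sketch, assuming the private LLM is OpenAI-compatible.
# Endpoint, header name, and model name below are placeholders.
from openai import OpenAI

client = OpenAI(
    base_url="https://llm.internal.example.com/v1",  # private endpoint (placeholder)
    api_key="<jwt-token>",                           # sent as the Authorization bearer token
    default_headers={
        "Ocp-Apim-Subscription-Key": "<subscription-key>",  # extra header (assumed name)
    },
)

response = client.chat.completions.create(
    model="private-model",  # whatever the private gateway expects (placeholder)
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```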
Hi, @gaya3-zipstack
We are using a private LLM, similar to OpenAI or Gemini, which is not on Unstract's supported list.
To use this private LLM, various parameters such as a JWT token, a subscription key, and an endpoint address are required.
We would like to test connecting this private LLM to the open-source version of Unstract.
Could you advise whether this is possible?
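To be concrete, this is roughly how we combine those three parameters (JWT token, subscription key, endpoint address) in a request today. The header names and payload shape are just assumptions for illustration, not the exact API of our private LLM:

```python
# Hypothetical sketch only: URL, header names, and payload shape are assumptions.
import requests

ENDPOINT = "https://llm.internal.example.com/v1/chat/completions"  # placeholder endpoint
JWT_TOKEN = "<jwt-token>"
SUBSCRIPTION_KEY = "<subscription-key>"

response = requests.post(
    ENDPOINT,
    headers={
        "Authorization": f"Bearer {JWT_TOKEN}",        # JWT typically travels as a bearer token
        "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,  # common subscription-key header (assumed)
        "Content-Type": "application/json",
    },
    json={
        "model": "private-model",  # placeholder model name
        "messages": [{"role": "user", "content": "Hello"}],
    },
    timeout=30,
)
response.raise_for_status()
print(response.json())
```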