
Add LiteLLM support so that we can use different LLM for prompt generation #33

Open
Greatz08 opened this issue Mar 23, 2024 · 0 comments

Instead of supporting only OpenAI and Claude, please consider adding LiteLLM (an open-source multi-LLM routing solution) to this project, so that we can point it at a local proxy server API endpoint. That endpoint could serve a self-hosted open-source LLM, a hosted open-source model (e.g. Mistral/Llama via Groq), or a proprietary LLM such as Google Gemini, and the project could use it to generate prompts as needed. I know performance might not match GPT-4, but open-source models are often capable of providing far better results than GPT-3.5.
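As a rough illustration of what this could look like: a LiteLLM proxy exposes an OpenAI-compatible `/chat/completions` endpoint, so prompt generation could be written once against that API and work with any backend the proxy routes to. The sketch below is hypothetical (the `generate_prompt` helper, proxy URL, and model name are assumptions, not part of this project), using only the standard library:

```python
import json
import urllib.request

def build_request(model: str, user_need: str) -> dict:
    # OpenAI-compatible chat payload, as accepted by a LiteLLM proxy.
    return {
        "model": model,
        "messages": [
            {"role": "system",
             "content": "You are a prompt engineer. Produce a refined prompt."},
            {"role": "user", "content": user_need},
        ],
    }

def generate_prompt(api_base: str, model: str, user_need: str) -> str:
    # POST to the proxy's OpenAI-compatible endpoint. The same call works
    # whether the proxy routes to a self-hosted model, Groq, or Gemini.
    req = urllib.request.Request(
        f"{api_base}/chat/completions",
        data=json.dumps(build_request(model, user_need)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

# Example (requires a running LiteLLM proxy, e.g. `litellm --model ollama/llama2`):
# generate_prompt("http://localhost:4000", "ollama/llama2",
#                 "a prompt for summarizing long emails")
```

Because the payload format is the same for every backend, swapping providers would only mean changing the proxy's configuration, not this project's code.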
