Instead of supporting only OpenAI and Claude, please consider adding LiteLLM (an open-source multi-LLM abstraction layer) to this project. That way we could point it at a local proxy server API endpoint backed by a self-hosted open-source LLM, a hosted open-source model (e.g. Mistral or Llama via Groq), or a proprietary LLM such as Google Gemini, and use that to generate prompts as needed. I know performance might not match GPT-4, but open-source models are still capable of delivering far better results than GPT-3.5.
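A minimal sketch of what such an integration could look like, assuming LiteLLM's `litellm.completion()` call; the proxy URL, model names, and helper functions below are illustrative assumptions, not part of this project:

```python
# Hypothetical sketch: routing generation through LiteLLM so any
# OpenAI-compatible endpoint (local proxy, Groq, Gemini, ...) can be used.

def build_completion_kwargs(model, prompt, api_base=None, api_key=None):
    """Assemble keyword arguments for litellm.completion()."""
    kwargs = {
        # e.g. "groq/llama3-70b-8192", "gemini/gemini-pro",
        # or "openai/<name>" for any OpenAI-compatible proxy
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    if api_base:
        # Point at a self-hosted or proxy endpoint (assumed URL below)
        kwargs["api_base"] = api_base
    if api_key:
        kwargs["api_key"] = api_key
    return kwargs


def generate(model, prompt, api_base=None, api_key=None):
    """Send the prompt through LiteLLM and return the completion text."""
    import litellm  # lazy import; requires `pip install litellm`

    response = litellm.completion(
        **build_completion_kwargs(model, prompt, api_base, api_key)
    )
    return response.choices[0].message.content


if __name__ == "__main__":
    # Example: target a local proxy serving an open-source model
    kwargs = build_completion_kwargs(
        "openai/mistral-7b-instruct",
        "Generate a prompt for summarizing a document.",
        api_base="http://localhost:4000",  # hypothetical local proxy
    )
    print(kwargs["model"])
```

Because LiteLLM normalizes many providers behind one call signature, the rest of the project would only need a configurable model string and optional `api_base` to support all of these backends.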