Add Option to Specify Other LLM Providers in .env #10
/bounty $20

💎 $20 bounty • Daytona

Steps to solve:
If no one is assigned to the issue, feel free to tackle it without confirmation from us, after registering your attempt. If multiple PRs are submitted by different people, we will generally accept the one with the cleanest code. Please respect others by working only on PRs you are allowed to submit attempts to; e.g. if you have reached the limit of active attempts, please wait until you are able to register a new one before submitting another PR. If you cannot submit an attempt, you will not receive a payout. Thank you for contributing to daytonaio/devcontainer-generator!
/attempt #10

Hey there! 👋 Here's my plan for adding multi-LLM support to the project: create a simple but flexible system to swap between different AI providers (OpenAI, Anthropic, Groq, etc.). Main change needed: a new provider system managing the different LLMs. What do you think? Does this seem right?
@madman024 sounds high-level, but sure, you can try.

@nkkko ok sure, I am working on it.
/attempt #10

Since it's been a long time with no movement, and the issue isn't assigned to anyone, I'm opening a PR.

💡 @RS-labhub submitted a pull request that claims the bounty. You can visit your bounty board to reward it.
Is your feature request related to a problem? Please describe.
The current implementation supports only Azure OpenAI as the LLM provider. It lacks the flexibility to support other LLM providers such as OpenAI directly, Anthropic, Google, or Groq. Adding this flexibility would also make future migration to a more efficient provider easy.

Describe the solution you'd like
Extend the .env configuration file to support specifying alternative LLM providers, and update main.py to read and initialize the appropriate LLM client based on the provider selected in the .env file.

Additional context
Update .env to include environment variables for configuring alternative LLM providers.
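One way the requested change could look is a small provider registry keyed off an environment variable. This is a minimal, hypothetical sketch: the variable names (LLM_PROVIDER, the *_API_KEY names) and the ProviderConfig class are illustrative assumptions, not the project's actual implementation, and a real version would construct the corresponding SDK client instead of returning a config object.

```python
import os
from dataclasses import dataclass

@dataclass
class ProviderConfig:
    """Illustrative provider descriptor; a real implementation
    would build the actual SDK client here."""
    name: str
    api_key_var: str  # env var expected to hold the API key
    base_url: str

# Hypothetical registry of supported providers.
PROVIDERS = {
    "azure_openai": ProviderConfig("azure_openai", "AZURE_OPENAI_API_KEY",
                                   "https://<resource>.openai.azure.com"),
    "openai": ProviderConfig("openai", "OPENAI_API_KEY",
                             "https://api.openai.com/v1"),
    "anthropic": ProviderConfig("anthropic", "ANTHROPIC_API_KEY",
                                "https://api.anthropic.com"),
    "groq": ProviderConfig("groq", "GROQ_API_KEY",
                           "https://api.groq.com/openai/v1"),
}

def select_provider() -> ProviderConfig:
    """Pick a provider from the LLM_PROVIDER env var (e.g. set in .env),
    defaulting to the current Azure OpenAI behavior."""
    name = os.getenv("LLM_PROVIDER", "azure_openai").lower()
    if name not in PROVIDERS:
        raise ValueError(f"Unsupported LLM provider: {name!r}")
    return PROVIDERS[name]
```

With this shape, switching providers is a one-line change in .env (e.g. LLM_PROVIDER=groq plus the matching API key), and main.py only needs to call select_provider() once at startup.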