
Add Option to Specify Other LLM Providers in .env #10

Open
nkkko opened this issue Sep 25, 2024 · 7 comments · May be fixed by #43

Comments

@nkkko
Member

nkkko commented Sep 25, 2024

Is your feature request related to a problem? Please describe.
The current implementation supports only Azure OpenAI as the LLM provider. It lacks the flexibility to support other LLM providers such as OpenAI directly, Anthropic, Google, or Groq. Adding this flexibility would also make it easy to migrate to a more efficient provider in the future.

Describe the solution you'd like

  1. Update the .env configuration file to support specifying alternative LLM providers.
  2. Modify the backend in main.py to read and initialize the appropriate LLM client based on the selected provider from the .env file.
  3. Implement helper functions to set up clients for the new LLM providers.

Additional context

  • Update .env to include environment variables for configuring alternative LLM providers (a rough sketch is shown below).
  • Thoroughly test switching between different LLM providers to ensure smooth initialization and operation.
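
As a rough illustration (the variable names, provider keys, and helper function below are assumptions for discussion, not code that exists in the repo), the .env additions might look like:

```dotenv
# Example .env additions (illustrative variable names)
LLM_PROVIDER=openai            # one of: azure-openai, openai, anthropic, groq
OPENAI_API_KEY=...
ANTHROPIC_API_KEY=...
GROQ_API_KEY=...
```

and main.py could pick the client from a small helper such as:

```python
# Hypothetical helper for main.py: initialize a client based on LLM_PROVIDER.
# SDK imports are done lazily so only the selected provider's package is required.
import os

def get_llm_client():
    provider = os.getenv("LLM_PROVIDER", "azure-openai").lower()

    if provider == "azure-openai":
        from openai import AzureOpenAI
        return AzureOpenAI(
            api_key=os.getenv("AZURE_OPENAI_API_KEY"),
            azure_endpoint=os.getenv("AZURE_OPENAI_ENDPOINT"),
            api_version=os.getenv("AZURE_OPENAI_API_VERSION"),
        )
    if provider == "openai":
        from openai import OpenAI
        return OpenAI(api_key=os.getenv("OPENAI_API_KEY"))
    if provider == "anthropic":
        from anthropic import Anthropic
        return Anthropic(api_key=os.getenv("ANTHROPIC_API_KEY"))
    if provider == "groq":
        from groq import Groq
        return Groq(api_key=os.getenv("GROQ_API_KEY"))

    raise ValueError(f"Unsupported LLM_PROVIDER: {provider!r}")
```

The lazy imports keep the optional SDKs from becoming hard dependencies when their provider isn't selected.
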
@nkkko
Member Author

nkkko commented Oct 9, 2024

/bounty $20


algora-pbc bot commented Oct 9, 2024

💎 $20 bounty • Daytona

Steps to solve:

  1. Start working: Comment /attempt #10 with your implementation plan
  2. Submit work: Create a pull request including /claim #10 in the PR body to claim the bounty
  3. Receive payment: 100% of the bounty is received 2-5 days post-reward. Make sure you are eligible for payouts

If no one is assigned to the issue, feel free to tackle it, without confirmation from us, after registering your attempt. In the event that multiple PRs are made from different people, we will generally accept those with the cleanest code.

Please respect others by working on PRs that you are allowed to submit attempts to.

e.g. If you reached the limit of active attempts, please wait for the ability to do so before submitting a new PR.

If you cannot submit an attempt, you will not receive your payout.

Thank you for contributing to daytonaio/devcontainer-generator!


Attempt | Started (GMT+0) | Solution
🟢 @madman024 | Oct 10, 2024, 7:16:37 PM | WIP
🟢 @RS-labhub | Dec 5, 2024, 12:25:19 PM | #43

@madman024
Contributor

madman024 commented Oct 10, 2024

/attempt #10

Hey there! 👋 Here's my plan for adding multi-LLM support to the project:
Adding Multiple LLM Providers - Implementation Plan
What I will do:

  • Create a simple but flexible system to swap between different AI providers (OpenAI, Anthropic, Groq, etc.)
  • Update our config to handle different provider settings
  • Refactor the existing Azure OpenAI code to use this new system
  • Add proper error handling for when providers act up

Main changes needed:

  • New provider system managing the different LLMs (rough sketch below)
  • Update to our .env setup
  • Tweaks to the database to track which provider was used
  • Some UI updates to show/select providers
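
For illustration only, the provider layer could be a thin wrapper along these lines (the class, function, and error names are assumptions, not existing code; the call shown is OpenAI-style, and Anthropic's SDK uses a different messages API, which is exactly why an adapter layer helps):

```python
# Hypothetical provider wrapper: records which provider was used and normalizes errors.
import os
from dataclasses import dataclass

@dataclass
class LLMResponse:
    text: str
    provider: str  # stored alongside results so the DB can track the provider used

class ProviderError(RuntimeError):
    """Raised when the selected provider fails, so callers handle errors uniformly."""

def complete(prompt: str) -> LLMResponse:
    provider = os.getenv("LLM_PROVIDER", "azure-openai").lower()
    try:
        client = get_llm_client()  # e.g. the helper sketched earlier in this issue
        # OpenAI-style call; works for the OpenAI, Azure OpenAI, and Groq clients.
        # An Anthropic client would need its own branch using messages.create.
        result = client.chat.completions.create(
            model=os.getenv("LLM_MODEL", "gpt-4o"),
            messages=[{"role": "user", "content": prompt}],
        )
        return LLMResponse(text=result.choices[0].message.content, provider=provider)
    except Exception as exc:
        raise ProviderError(f"{provider} request failed: {exc}") from exc
```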

What do you think? Does this seem right?

@nkkko
Member Author

nkkko commented Oct 11, 2024

@madman024 Sounds high-level, but sure, you can try.

@madman024
Contributor

@nkkko OK, sure, I'm working on it.

@RS-labhub

RS-labhub commented Dec 5, 2024

/attempt #10

Since it's been a long time with no movement, and the issue isn't assigned to anyone, I'm opening a PR.

Algora profile | Completed bounties | Tech
@RS-labhub | 4 daytonaio bounties | JavaScript, Python, Go & more


algora-pbc bot commented Dec 5, 2024

💡 @RS-labhub submitted a pull request that claims the bounty. You can visit your bounty board to reward.
