Added Groq Support #1238
base: main
Conversation
… no errors. Though final answers are hallucinated rather than actual output. It seems to plan and write code, but not execute it.
…TING.md + minor cleanup
@KillianLucas please open a branch as it is now, the errors are coming from litellm's side. Thanks a lot and all the best! 😇
Current state of PR:
Todos:
```diff
@@ -26,6 +31,7 @@ def __init__(self, interpreter):
         # Settings
-        self.model = "gpt-4-turbo"
+        self.model = "groq/mixtral-8x7b-32768"  # can now use models from groq. `export GROQ_API_KEY="your-key-here"` or use --model
```
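The `groq/` prefix in that model string follows LiteLLM's provider-routing convention: the text before the first slash names the provider, and the remainder is the model id passed to that provider. A minimal sketch of that split (the helper name here is hypothetical, not part of LiteLLM's API):

```python
def split_provider_model(model_string: str):
    """Split a LiteLLM-style model string into (provider, model_id).

    Strings without a slash carry no explicit provider prefix,
    so the provider comes back as None.
    """
    if "/" in model_string:
        provider, model_id = model_string.split("/", 1)
        return provider, model_id
    return None, model_string

# The model name used in this PR routes to the Groq provider:
print(split_provider_model("groq/mixtral-8x7b-32768"))
# → ('groq', 'mixtral-8x7b-32768')

# A plain OpenAI-style name has no provider prefix:
print(split_provider_model("gpt-4-turbo"))
# → (None, 'gpt-4-turbo')
```

This is why the PR only needs to swap the default model string: the routing to Groq's OpenAI-compatible endpoint is LiteLLM's job, not Open Interpreter's.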
This line should be deleted before merging to the main branch.
Damn, I came to add this fork, beat me by an hour! Nice going!
haha thanks @Cobular, as stated this PR doesn't make code execute yet, it just swaps the completion APIs correctly. One important thing… there is a way to make it work now, just not with

BerriAI/litellm#3176
NICE. Love Groq, great work on this @fire17. As @CyanideByte mentioned, I think we should push this into LiteLLM (they abstract away the Groq interaction so it's equivalent to an OpenAI client), and it looks like it works with the latest LiteLLM! In that case, it would be great if we could merge this PR with just the documentation. I'll make that change and then merge if that's okay with you. If there's anything else to include from the PR, let me know and we can reopen.
For sure! @KillianLucas All the best! P.S. check out my new PR #1259
When will the merge happen?
@fire17 Would you like to update this PR, or would you prefer that I close it?
Describe the changes you have made:
Groq's official Python API now fits well into the oi flow, with no errors. Though final answers are hallucinated rather than actual output; it seems to plan and write code, but not execute it yet.
Reference any relevant issues:
Trying to get groq/mixtral to work #1237
a.k.a. Groq is not working with LiteLLM out of the box:
`--model groq/mixtral-8x7b-32768` throws errors.

Pre-Submission Checklist (optional but appreciated):
docs/CONTRIBUTING.md
docs/ROADMAP.md
(not fully yet, but no mention of groq)

OS Tests (optional but appreciated):