Hi, thanks for your nice repo. The README mentions two RTX 3090s:

```
The following hardware is needed to run different models in MiniLLM:
```

But when I try the 65B version with 2× RTX 3090 I get an OOM error. How can I use both GPUs?
Kind regards,
Dirk
Ah, sorry, I haven't implemented that feature yet :) happy to merge a PR if someone does that first
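For anyone looking for a stopgap until multi-GPU support lands in MiniLLM itself: the sketch below is not MiniLLM's API. It shows the generic Hugging Face transformers + accelerate approach to sharding a large model across two GPUs via `device_map="auto"` and per-device `max_memory` caps. The checkpoint name, the 22 GiB caps, and the bitsandbytes 4-bit load (standing in for MiniLLM's own GPTQ weights) are all assumptions, so treat it as a rough illustration only.

```python
# Hedged sketch, not MiniLLM code: shard a 4-bit LLaMA-65B across two 24 GB GPUs
# using transformers + accelerate. Checkpoint name and memory caps are placeholders.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "huggyllama/llama-65b"  # placeholder checkpoint; swap in your own

tokenizer = AutoTokenizer.from_pretrained(model_name)

# device_map="auto" lets accelerate split layers across all visible GPUs;
# max_memory caps each 3090 below 24 GB so activations still have headroom.
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    device_map="auto",
    max_memory={0: "22GiB", 1: "22GiB"},
    load_in_4bit=True,          # bitsandbytes 4-bit; stand-in for MiniLLM's GPTQ weights
    torch_dtype=torch.float16,  # compute dtype for the non-quantized modules
)

inputs = tokenizer("The capital of France is", return_tensors="pt").to(0)
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

At 4-bit precision a 65B model is roughly 33 GB of weights, so it should fit across 2× 24 GB cards but not on one, which is consistent with the OOM reported above.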
For what it's worth, I was running the 65B model on a single A6000.