Linux cuda crash #270
We have a new set of binaries ready to release in the next version (very soon). Could you try pulling the master branch and testing whether it fixes this issue?
Same error.
What CPU and GPU are you using? Edit: Original post shows
CPU: i7-8700
It's quite weird, because the only difference in how the native library has been compiled since v0.6.0 is AVX. We've just published v0.8.0, with a fix for the CUDA library (though I'm not sure it helps with this issue) and the CUDA feature detection from #275. Could you please give it a try? If it still doesn't work, you could add
Oh, it’s normal |
Is it v0.8.0 that's normal, or your self-compiled library?
Stopping the logs from llama.cpp is a separate issue. For now, you could redirect the output elsewhere.
Update: the approach in this comment lets you capture the native library's output and send it anywhere else.
v0.8.0
Thank you.
I'm getting a crash using LLamaSharp.Backend.Cuda11
This problem occurs in 0.6.0 and 0.7.0; 0.5.1 works normally.
Env:
Ubuntu 22.04