Xvnc crashes with SIGBUS on cross-GPU DRI usage #1772
More details available in this thread: https://lists.freedesktop.org/archives/mesa-dev/2024-June/226245.html

A bug has been reported to the kernel: https://bugzilla.kernel.org/show_bug.cgi?id=218993
I observe a bus error when attempting to start a VMware virtual machine with 3D acceleration. VMware uses Vulkan, and the failure seems to occur at exactly the same place as the failure described in this issue. (The symptoms are identical when I start a VMware virtual machine with 3D acceleration vs. when I run
It does appear to be the same issue. If I set
(based on the implementation in TigerVNC 1.14 beta)

- Synchronize pixels between DRI3 pixmaps and their corresponding GBM buffer objects on an as-needed basis, in response to specific X11 operations rather than on a schedule.
- Implement the simpler DRI3 v1 interface rather than DRI3 v2. This avoids the need to implement the get_formats(), get_modifiers(), and get_drawable_modifiers() methods.
- Use Pixman (which is SIMD-accelerated) to synchronize pixels.
- Hook the DestroyPixmap() screen method to clean up a pixmap's corresponding GBM buffer object if there are no more references to the pixmap.
- Hook the CloseScreen() screen method to clean up the GBM device and close the DRM render node.

To do:

- Synchronize only the pixels that have changed.

Known issues: TigerVNC/tigervnc#1772
Has any fix been found for this issue yet?

EDIT: I have an idea, but I don't know how to do it. How can I run the server entirely on the dGPU without using the AMD driver? I have two GPUs: the iGPU is an AMD Radeon Vega and the dGPU is an RTX 3050. Can I run the server on the RTX 3050 only?
Please see the upstream bug reports linked above. But no, currently we haven't seen any update from them with a fix.
Yes, with the

You could also see if you can completely disable the iGPU in UEFI, if it's not being used.
My laptop cannot disable the iGPU.
How do I set

EDIT: I figured it out. I needed to use the user config file and add the parameter like this:
However, I have a weird issue with performance. The game runs just fine and
Indeed. Nvidia's driver is incompatible with TigerVNC, so you're not getting the same acceleration as with other drivers. Their driver appears to provide some basic acceleration, since it is faster than pure CPU rendering, but it is still far slower than what the GPU should be able to do. We can't do much about this until Nvidia either becomes more compatible with the open-source driver model or documents their proprietary magic.
My understanding from their driver devs is that their proprietary magic is based on DRI2, which allocates GPU buffers on the X server. DRI3 instead allocates GPU buffers in the X client, at the expense of GLX conformance. (Multiple processes cannot render to the same GLX drawable with DRI3, but fortunately few applications need to do that.) nVidia's drivers also make heavy use of their proprietary and undocumented

I strongly suspect that the hack described in #1773 (setting
Describe the bug

If I start Xvnc with `-renderNode` set to my integrated AMD GPU, and then start an application using my discrete Nvidia GPU, then Xvnc will crash with SIGBUS:

To Reproduce

Steps to reproduce the behavior:

1. `Xvnc -renderNode /dev/dri/renderD128 :2` (assuming `renderD128` is the AMD iGPU)
2. `DISPLAY=:2 vkcube --gpu-number 1` (assuming GPU 1 is the Nvidia dGPU)

Expected behavior

vkcube renders perfectly normally on the Xvnc display.
Client (please complete the following information):
No client needed.
Server (please complete the following information):
Additional context
Also crashes with an Intel ARC discrete GPU instead of the Nvidia one.
Does not crash if Xvnc is started with the discrete GPU and the application uses the integrated GPU. Possible bug in AMD driver?