Scale factor calculation on Linux #878
Part of me thinks this is the result of an error that SDL is giving us but we aren't catching. If we look at the code in question, we can see that we don't check for an error like is documented here: https://wiki.libsdl.org/SDL_GetDisplayDPI. I can't imagine why it would return values like the ones you posted if it were operating properly. I'll investigate this tonight, sorry you're having this issue!
No problem, actually let me investigate that one :)
@taylon haha be my guest! Let me know what you come up with!
So I did some investigation and it is pretty complicated 😆 The values returned by SDL2 are indeed wrong, and that is because the value reported by xrandr itself is wrong (more precisely, here: SDL_x11modes.c). It seems to be a pretty well-known issue though; the following links have more details:

So at least on X11 it seems that the best way to obtain that information is through Xresources, since that is what most DEs use and also how GLFW implements it. The best solution would probably be to implement something similar to GLFW somewhere (potentially in SDL2?), although even then that would only cover X11; I have not investigated Wayland...

Either way, it is hard to say how many users would end up with a messed-up scale factor like I did, but I don't think it is worth the risk. When users notice that the UI is too small, they would look for zoom or something else in the documentation to make it bigger, but when they can't see anything at all, it is a lot harder to know where to start looking for a solution.

So my suggestion would be to default to 1.0 if GDK_SCALE is not set, at least until a more definitive solution gets implemented. I went ahead and created a PR, #896, for that, and another one, revery-ui/reason-sdl2#78, for handling the error potentially returned from SDL. But let me know what you guys think :)
Wow, that is pretty complicated. Thanks for doing all that research, @taylon! I think your fix in #896 is reasonable. To me, it makes more sense to have a potentially low-scale display than one that is way too big, as seen in your case. It might make sense to patch something into I'm curious to know what @bryphe thinks, though. To me it seems that, from a purely operational standpoint, for programs like Oni small rendering is better than none at all.
Thanks for the detailed investigation, @taylon! Indeed... a scale factor of
Agreed! This sounds like the right solution. I believe we had that inferred-DPI code back when we used GLFW, which may have worked correctly at that point (as it seems like GLFW handles the
Hi!
I noticed an issue with the scaling factor calculation on Linux that causes a very high scale factor to be set, which in turn makes Oni2, as an example, render like this:
Looking at
revery/src/Core/Window.re
Lines 94 to 97 in 9cf3daa
I'm using a very unusual monitor resolution of 5120x1440 which might be an edge case that causes some weird bug, or it might be X returning it wrong for whatever reason.
Either way, what do you think would be a good mitigation for this?
We could say that if the resulting scaleFactor is higher than a given threshold, then we assume the calculation is wrong and set it to 1.0?