screen.colorDepth returns 30 on a standard 32-bit monitor. #29

Comments
On Sunday January 26 2025 18:49:21 Jonathan wrote:
P.S. I am in fact quite interested in whether Mavericks could drive an HDR display, the HDR wouldn't work but I think 10 bit color might? Something to try some day...
Since you're hijacking your own thread: I'm dubious that you'd notice a lot of difference in everyday life. IIUC a 10-bit depth will give you 1024 levels of pure gray instead of just 256, but you need content that will actually make use of those extra levels. I wouldn't be surprised if most code and codecs assume that a pixel has 3 8-bit colour components plus an optional 8-bit transparency component.
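(A tiny Swift sketch of that point, purely illustrative: both layouts below spend 32 bits per pixel, one as 8-8-8-8 RGBA and one as 10-10-10 RGB plus 2 bits of alpha. The channel ordering is made up for the example; real formats differ.)

```swift
// Illustrative only: two ways to spend 32 bits on one pixel.

// "RGBA8888": 8 bits per colour channel plus 8 bits of alpha -> 256 levels per channel.
func packRGBA8888(r: UInt32, g: UInt32, b: UInt32, a: UInt32) -> UInt32 {
    precondition(r < 256 && g < 256 && b < 256 && a < 256)
    return (r << 24) | (g << 16) | (b << 8) | a
}

// "RGB10A2": 10 bits per colour channel plus only 2 bits of alpha -> 1024 levels per channel.
func packRGB10A2(r: UInt32, g: UInt32, b: UInt32, a: UInt32) -> UInt32 {
    precondition(r < 1024 && g < 1024 && b < 1024 && a < 4)
    return (r << 22) | (g << 12) | (b << 2) | a
}

// Full white fills the word either way; the difference is how finely each channel is graded.
print(String(packRGBA8888(r: 255, g: 255, b: 255, a: 255), radix: 16))   // ffffffff
print(String(packRGB10A2(r: 1023, g: 1023, b: 1023, a: 3), radix: 16))   // ffffffff
```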
i'd find out whether this value means what you think it does. official build on crapalina also returns 30 on my i9980 macbook 16
That checks out: the 2019 16" MacBook Pro supports "Wide color (P3)" according to https://support.apple.com/en-us/111932. Mind, I'm still not 100% confident I understand the situation here, just that what you said matches my current understanding. A display that supports P3 wide color like the 2019 MBP should return 30, and one that supports boring old Rec. 709 should return 24.
Additionally, to be clear: as far as I'm concerned this should be the lowest of low-priority items, but I think it's good to know about in case we eventually find a website that breaks because of it.
it's this commit that's causing the problem: https://hg.mozilla.org/mozilla-central/rev/1b57e65d4d7ec195e6373e731112d8b5539ba86a
why? i don't know. but i moved the old lines back and now it would report 24 in my VM
Cool! But... now does your 2019 MBP, which as far as we know should report 30, still report 30?
i don't think so, because my imac pro is reporting 24 as well if i revert the change
i dunno why that nswindowdepth loop is going off the deep end though. and without it, i don't get 30 bit colour. kind of strange.
Mine also says 32 bit color. But apparently (again, from what I understand) this actually should lead to a screen.colorDepth of 24, since the alpha bits aren't counted.
i don't think this is an easy thing for APPUL machines. on brojave, it seems there is no Pixel Depth property and instead it became "framebuffer depth". and this guy's answer: https://stackoverflow.com/questions/36902312/detecting-if-a-display-supports-30-bit-color won't work here because we don't have a window object for this screen. we simply want to run through the screens and find the best depth. https://forums.developer.apple.com/forums/thread/24926 the useless APPUL developers don't have an answer, either. as far as i'm concerned, i highly doubt this will be a problem. but leave it open and maybe you'll get a bite.
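(For reference, a minimal Swift sketch of "run through the screens and find the best depth" using plain AppKit; this is not the Gecko code, just the underlying calls the discussion is about, and it has to run inside a normal GUI login session:)

```swift
import AppKit

// Ask AppKit what depth it reports for each attached screen.
// On a P3-capable panel this tends to come back as 10 bits per sample (i.e. "30-bit colour"),
// even if the actual framebuffer/scanout is still dithered 8-bit.
for (index, screen) in NSScreen.screens.enumerated() {
    let depth = screen.depth                               // the screen's current window depth
    let bitsPerSample = NSBitsPerSampleFromDepth(depth)    // bits per colour component
    let bitsPerPixel = NSBitsPerPixelFromDepth(depth)      // bits per pixel for that depth
    print("screen \(index): \(bitsPerSample) bits/sample, \(bitsPerPixel) bits/pixel")
}
```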
at the time the Big Brad Werth (like big bad wolf, get it @bradwerth? ha) made this commit, 10.12 was the lowest supported OS. so it probably works right on newer OSes. the problem is that sub-10.11 systems report the colour depth as opposed to the framebuffer depth, i think.
I'm curious if @krackers has any thoughts. The other question is, if we can't feasibly return the correct value based on the color depth, should Firefox return 30 or 24? The latter is more likely to be correct, although that doesn't necessarily mean it's the right choice.
it should be 24, and it would be 24 on an APPUL os newer than or equal to 10.11. unfortunately i'm too lazy to confirm this by opening my laptop since i'd need to sync my mercurial and rebuild the source, which i haven't done since i've returned to my main machine on saturday.
@Wowfunhappy Yes i do have thoughts. This is a deep rabbit hole, and it depends on what exactly you want to report. There are 3 different spaces to consider: window backing store, framebuffer, and scanout. Like most compositing window managers, osx will happily give you a window backing store with any bit depth you like. Internally the window manager (WindowServer) composites everything down to the framebuffer, and the GPU handles the scanout to the monitor, which has its own native panel bit-depth.

Life pre-M1 used to be simple: the framebuffer depth matched the max depth reported by the monitor (aka connection depth). With M1, the rabbit hole got much deeper. You can find the thread on the ledstrain forum about people trying to disable temporal dithering on M1 macs. What they found is that on such macs the internal framebuffer is always 10-bit regardless of what monitor you are connected to, and the GPU itself performs dithering before scanout. In fact, I read somewhere that depending on the connected monitor the GPU may perform not only spatial dithering but temporal dithering as well, effectively simulating FRC on the gpu side.

(One thing I have not found much info about is whether, on older osx where framebuffer depth = scanout depth, WindowServer will internally dither when compositing a high-depth backing store (e.g. rgbf16) down onto an rgb888 backing store. The 10.11 extended-depth GPU sample code mentions it does, so from 10.11+ it's probably true, but before that there's no authoritative proof. You could probably test it out fairly easily but I'm too lazy. It probably must, though: since the NeXTStep days appkit has allowed you to choose a higher-depth backing store, and back in the day printing to printers that didn't support high-depth color was more common, so surely osx must do dithering when compositing the individual backing stores. I tried to disassemble CoreGraphics but didn't find anything concrete; there is a dithering shader written in OpenGL but I couldn't trace where it is used.)

Also, I don't think that snippet will work here. On older osx versions testing depthLimit might have worked, not too sure. One way that definitely works to get the framebuffer depth is using a private CoreGraphics API to get the display mode. But I think that's overkill: pre-10.11 the framebuffer depth is always 8-bit, and in the post-M1 era the framebuffer is always 10-bit, so just hardcode it. https://maclabs.jazzace.ca/2018/12/01/colour-deep-and-wide.html

Who knows why Apple makes it so hard to find the actual framebuffer depth. You'd think that'd be important information for the application to decide when requesting a backbuffer, to avoid unnecessary dithering. Maybe apple just wants you to use appkit, which takes care of requesting the right asset type for you. Maybe they want you to always request a high-depth framebuffer and let the OS handle dithering for you. Who knows. You can run the System Information display report and look at the framebuffer depth it lists.
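(A minimal Swift sketch of the public but long-deprecated cousin of that private API: CGDisplayModeCopyPixelEncoding, surfaced in Swift as the pixelEncoding property and deprecated since 10.11. Treat it as illustrative rather than authoritative; on newer systems it may return nothing at all:)

```swift
import CoreGraphics

// Ask Quartz for the main display's current mode and print its pixel encoding string.
// On older systems a plain 8-bit-per-channel framebuffer typically shows up as a
// 32-bit encoding such as "--------RRRRRRRRGGGGGGGGBBBBBBBB".
if let mode = CGDisplayCopyDisplayMode(CGMainDisplayID()),
   let encoding = mode.pixelEncoding {
    print("pixel encoding: \(encoding)")
} else {
    print("pixel encoding not available (API is deprecated and may return nothing)")
}
```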
@krackers I wish I understood this space a little bit better. So, following one of your links... I guess the core question is: if you plugged pre-10.11 OS X into a 10-bit monitor, would it be possible to correctly view the Adobe RGB photos on https://webkit.org/blog-files/color-gamut/comparison.html? Your link seems to imply it's not possible, in which case it would make sense for Firefox Dynasty to just always return 24.

But—I find that really surprising! Apple has long catered to professional photographers and video editors and so on. These are the kinds of people who have fancy monitors. Adobe RGB isn't a new thing; it has existed since the late 90s, well before the operating systems we're interested in. Presumably people using Aperture and Final Cut would want to see the content they were editing in 10-bit color. Mavericks also has a built-in display profile for "Adobe RGB".
You are confusing bit depth with gamut. https://dot-color.com/2011/10/08/the-difference-between-color-gamut-and-bit-depth/ You can do wide gamut over 8 bits, e.g. level 255 on an 8-bit wide-gamut display is more saturated than 255 on srgb. This does mean of course that the "gaps" between colors are greater when doing 8-bit wide-gamut than 8-bit srgb. I will get back to that later.
(This is my best guess, but empirical evidence is always welcome).
So in short "yes", wide-gamut would be displayed with the full saturation but you will not be able to display all intermediary levels as finely as you could with 10-bit. This is corroborated by https://macperformanceguide.com/AppleCoreRot-10-bit-color.html
(That being said, as recently as 10.7 various parts of the UI blindly assumed standard gamut, causing them to look over-saturated on a wide-gamut display. https://www.dpreview.com/forums/thread/3538545)

Now if you do 8-bit wide gamut, you have a natural issue that the steps between colors are further apart, which is why wide gamut is usually paired with 10 bits. This is in particular a problem with color space conversion: because most content is srgb, converting from srgb to wide gamut actually introduces quantization error. Think about it: 255 on srgb maps to 234 on P3, which means the entire 0-255 srgb range must map to 0-234 on P3. It's like you've only got 7.87 "usable" bits when displaying srgb content.

Does skipping dithering here result in noticeable banding? Well, I tried to look online and the answer is apparently not that much. First, 7.87 bits is still pretty good; most monitors used to have 6 bits. Secondly, you can do dithering as part of color space conversion (even if we're not technically changing the bit depth of the represented samples, we still lose representable colors going from 8-bit srgb to 8-bit P3). This is why Photoshop has an option to perform dithering as part of colorspace conversion (https://www.corpgraph.com/understanding-color).

I believe this is why "professional" monitors internally have LUTs to directly alter their target response, rather than requiring the OS to perform color conversion. E.g. your monitor could accept input in the srgb color space and do the color space conversion internally. The benefit is that the monitor itself might have a 10-bit panel, so this avoids the need for the OS to do the color space conversion at 8-bit output, and the monitor could also dither if needed. https://photo.stackexchange.com/questions/119282/will-there-be-stronger-colour-banding-on-8bit-wide-gamut-display-when-working Even if there's no dithering in the loop as part of colorspace conversion, the difference is probably not enough to be noticeable...

Now the real crux of the question is what pre-10.11 osx does. On recent osx the story is simple: since the framebuffer is 10-bit, the colorspace conversion is also done at 10 bits, and then the OS/GPU itself dithers down to 8 bits, so we get nice smooth gradients even on dumb monitors. On older osx the framebuffer itself is 8-bit, so the question is really whether Quartz/ColorSync dithers as part of the colorspace transformation. I cannot find a good answer, even though it'd be simple to test empirically (connect a <= 10.9 laptop to a wide-gamut monitor using a wide-gamut profile, load an 8-bit srgb band test image and see if it's more bandy than when connected to a regular srgb monitor). My naive guess is that it probably does, since even if 10-bit backbuffer to 8-bit conversion was not that common, wide-to-narrow color space conversion certainly was (e.g. printing, image export, etc.), and if Apple was really going for visual quality they'd be foolish not to use dithering to make the best use of the representable colors. Then again, I have seen some suggestions that ColorSync has no ability to perform dithering as part of conversion, since apple's own apps do not appear to do it, and neither does dithered LUT application seem to be used. Additionally there is a hint that VCGT LUT correction (which is confusingly an abuse of the ICC profile, since it actually "alters" the TRC of the monitor by introducing a correction term) is only done at high bit depth on more recent versions of osx.
https://hub.displaycal.net/forums/topic/macbook-prodisplaycaldatacolor-spyder-5-please-help/
That's the image they have, but when you look at things deeply you realize the rabbit hole goes deep and they deviate from expectations in surprising ways.
All of the above is very hard to piece together and, like anything Apple, it's hard to get concrete answers on precise behavior; often the only way is to empirically validate it. Don't look into quicktime 1.96 gamma...
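(Two toy Swift illustrations of the points above. The 255 → 234 figure is the one quoted in the comment, the exact mapping depends on the profiles involved, and the dither here is plain random noise rather than whatever ColorSync may or may not actually do:)

```swift
import Foundation

// The "7.87 usable bits" arithmetic: if sRGB white (255) lands at code value 234 in an
// 8-bit P3 framebuffer, sRGB content only spans 235 distinct output levels instead of 256.
let effectiveBits = log2(235.0)
print(String(format: "%.2f effective bits", effectiveBits))    // ~7.88

// Why dithering before truncation hides banding: quantize a smooth ramp to 8 levels,
// once plainly and once with +/- half a step of random noise added first.
let ramp = (0..<32).map { Double($0) / 31.0 }
let plain = ramp.map { Int(($0 * 7.0).rounded()) }
let dithered = ramp.map { value -> Int in
    let noise = Double.random(in: -0.5...0.5) / 7.0
    return min(7, max(0, Int(((value + noise) * 7.0).rounded())))
}
print(plain)      // long runs of identical values: visible bands
print(dithered)   // the runs are broken up; on average the ramp is preserved
```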
i'll get around to doing this later today, but i'm dumping this content here for @krackers so he can answer. krack, something weird happens on 10.7 that i think is the root of the "clogging" that occurs. i noticed that if i try to quit when i first open the browser, the prompt shows up and i can terminate without issue. if i create a new window (with at least one existing window), this also holds true. however if i close all open windows, create a new window and then try to quit, the prompt no longer appears and the blocking begins. it seems that lldb/gdb are catching an EXC_BAD_ACCESS (screen recording: Screen.Recording.2025-02-01.at.7.23.25.AM.mp4; backtrace: https://pastebin.mozilla.org/4gq4rbnL#L78). this paint code was introduced after they dropped support for 10.6-10.8, however it does work fine on 10.8, so i'm wondering what the problem could be.
@krackers Thanks, super interesting! So for Firefox Dynasty, it does seem like hardcoding anything < 10.11 to return 24, and anything ≥ 10.11 to return 30, would be reasonable. |
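(If the hardcoding route were ever taken, the decision itself is tiny. A hypothetical Swift sketch of just that decision, not the actual Firefox Dynasty patch, which would live in the Objective-C++ widget code; the 10.11 cutoff is the assumption from the discussion above:)

```swift
import Foundation

// Hypothetical: report 24-bit colour before OS X 10.11 and 30-bit from 10.11 onwards,
// per the "pre-10.11 framebuffers are 8-bit" assumption discussed in this thread.
func reportedColorDepth() -> Int {
    let v = ProcessInfo.processInfo.operatingSystemVersion
    let isTenElevenOrLater = (v.majorVersion, v.minorVersion) >= (10, 11)
    return isTenElevenOrLater ? 30 : 24
}

print(reportedColorDepth())
```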
actually i thought about it more after i wrote that i'd change it, but i decided against it. the reason is that, in my experience, 10.8 does support some kind of 10 bit colour whereas 10.7 does not. i learned this because 10.8 does not have problems showing videos on biplanar iosurfaces, whereas 10.7 can only support single planar. whilst APPUL's documentation claims that kCVPixelFormatType_420YpCbCr10BiPlanarFullRange exists from 10.4+, i don't think this is true. https://developer.apple.com/documentation/corevideo/kcvpixelformattype_420ypcbcr10biplanarfullrange it seems at least 10.8 and upwards have no problems with these constants, but 10.7 seems to just show white on any IOSurface that uses kCVPixelFormatType_420YpCbCr10BiPlanarFullRange or kCVPixelFormatType_420YpCbCr8BiPlanarFullRange (10 and 8 bit). now don't get me wrong, underneath it may be 8 bit, but i just don't see a reason to add hacks to the code where things look okay. i do appreciate the theory and deep dive by @krackers, and his own postulate suggests that pre 10.11 there is some kind of dithering or proper back conversion.
as @RJVB: it's doubtful there is any impact and you'd need to present that before i consider an otherwise-unneeded modification.
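(One way to probe, rather than trust the docs, whether a given OS build accepts those CoreVideo pixel format constants at all is to try creating a small pixel buffer in each format. A hedged sketch: a successful create only means CoreVideo recognises the format in main memory; it says nothing about whether the IOSurface display path on 10.7 would actually render it, which is where the "shows white" symptom lives:)

```swift
import CoreVideo

// Try to create a tiny CVPixelBuffer in each format and report whether CoreVideo accepts it.
let formats: [(String, OSType)] = [
    ("420YpCbCr8BiPlanarFullRange", kCVPixelFormatType_420YpCbCr8BiPlanarFullRange),
    ("420YpCbCr10BiPlanarFullRange", kCVPixelFormatType_420YpCbCr10BiPlanarFullRange),
]

for (name, format) in formats {
    var buffer: CVPixelBuffer?
    let status = CVPixelBufferCreate(kCFAllocatorDefault, 64, 64, format, nil, &buffer)
    let result = (status == kCVReturnSuccess) ? "accepted" : "rejected (status \(status))"
    print("\(name): \(result)")
}
```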
On Sunday February 02 2025 07:13:01 gagan sidhu wrote:
however if i close all open windows, create a new window and then try to quit, the prompt no longer appears and the blocking begins.
Only if you first closed all open windows?
FWIW I had a comparable issue with a previous release (the first 132 release?) where the top parent process would hang after running a more complex session with multiple windows. I did get the "really quit?" prompt though.
Sorry, what?? That can't have been in reference to my comment above and I don't see where I'd suggest any modification here.
it was supposed to say “as @RJVB said:”
as in, i don’t think any changes are needed because the change would likely make no, or little, difference.
and for me to make that change, he needs to show the existing way is problematic.
^ In other words, we'd want to find a website that breaks when screen.colorDepth is set to 30 before changing this, right? That makes sense to me—since, yes, it may not exist! I'll be on the lookout!
that is correct.
not even 'breaks', but 'an unrepresentative depiction of the coloured image'.
Thanks,
Gagan
This is not necessarily a bug; I do not understand the problem space well enough!
Running
console.log(screen.colorDepth)
while using a standard monitor with 32-bit color would usually be expected to print 24
(the bits used for the alpha channel aren't counted). However, in Firefox Dynasty, the above code prints
30
on both my Desktop (connected to what is, as far as I know, a completely boring 32-bit monitor) and my MacBook Air (using the internal LCD screen). Is this expected behavior? I can't tell; it seems odd.
P.S. I am in fact quite interested in whether Mavericks could drive an HDR display, the HDR wouldn't work but I think the added color depth might? Something to try some day...