
Why the recommendation to use sRGB default? #32

CrHasher opened this issue Sep 11, 2022 · 0 comments

Comments

@CrHasher

I read all the information on the DisplayCAL forum, and I still don't understand the recommendation to use the Windows built-in sRGB profile as the display profile.

dwm_lut applies the LUT to everything displayed on screen, but wouldn't color-managed apps still benefit from an sRGB profile created for the display with DisplayCAL?

My reasoning is that a program needs to know some display characteristics to render content correctly; otherwise it assumes some standard, a sort of reference display, and squeezes all displayed pixels into that standard. In most cases apps assume a 1000:1 contrast ratio and a gamma 2.2 or sRGB tone curve, and obviously a display is, most of the time, different from the factory than this assumed common-denominator standard.
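To make the "assumed tone curve" point concrete, here is a small sketch (not from the issue) comparing the standard piecewise sRGB transfer function with a pure gamma 2.2 power law. The two curves are close but not identical, which is exactly why it matters which one an app assumes when decoding pixel values:

```python
def srgb_eotf(v: float) -> float:
    """Decode a nonlinear sRGB value to linear light (IEC 61966-2-1 piecewise curve)."""
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4

def gamma22_eotf(v: float) -> float:
    """Decode assuming a pure power-law gamma of 2.2 instead."""
    return v ** 2.2

# The curves diverge, especially near black (linear segment vs power law).
for v in (0.02, 0.2343, 0.5, 0.9):
    print(f"{v:.4f}: sRGB={srgb_eotf(v):.5f}  gamma2.2={gamma22_eotf(v):.5f}")
```

The piecewise sRGB formula above is the standard one; only the sample values are arbitrary.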

So wouldn't it make sense to create a display profile (i.e. an sRGB-type profile) in DisplayCAL, use it as the default ICC profile, generate a 3DLUT from it (with a preferred gamma of 2.2, for example), and apply that 3DLUT with dwm_lut?

My only concern with this approach is the vcgt, which might be applied twice (once by the DisplayCAL profile loader and once by dwm_lut); I'm not sure whether that would actually happen. I could set the tone curve to "as measured" in DisplayCAL when creating the profile to avoid this.
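The double-application worry can be sketched like this. The calibration curve here is a made-up 1D gamma tweak purely for illustration (a real vcgt is a per-channel table), but it shows why applying the same correction in both the profile loader and the 3DLUT would over-correct:

```python
def vcgt(v: float) -> float:
    """Hypothetical calibration curve: nudge an assumed native gamma 2.3 toward 2.2."""
    return v ** (2.2 / 2.3)

v = 0.5
once = vcgt(v)          # intended result: correction applied one time
twice = vcgt(vcgt(v))   # loader + LUT both applying it: over-corrected

print(f"input: {v:.4f}  once: {once:.4f}  twice: {twice:.4f}")
```

Since the exponent is below 1, each extra application lifts mid-tones further, so the doubly-corrected value drifts away from the intended one.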

What are your thoughts on this topic?

My aim is to give apps information about the display (contrast, white point, and so on) via the ICC profile so they don't have to assume anything, while globally applying a 3DLUT on top.

It would just be something like a pixel value going from:

With a profile: 0.2343 (value in the file) -> 0.2143 (what the app thinks it should be in sRGB) -> 0.2142 (what actually shows once the LUT is applied).

My concern is the app doing this instead:

Without a profile: 0.2343 (value in the file) -> 0.2452 (based on the app's assumptions) -> 0.2243 (final value after dwm_lut is applied, based on the app's wrong value).
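The two chains above can be sketched in code. Everything here is hypothetical for illustration: the app-side decode functions stand in for "app uses the real profile" vs "app guesses", and a trivial 1D curve stands in for dwm_lut's 3DLUT. The point is only that the global LUT composes with whatever the app outputs, so an app-side wrong assumption survives into the final value:

```python
def srgb_eotf(v: float) -> float:
    """Decode used by a color-managed app that knows the display (standard sRGB curve)."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def app_assumption(v: float) -> float:
    """Hypothetical decode an unmanaged app might fall back to (pure gamma 2.2)."""
    return v ** 2.2

def display_lut(v: float) -> float:
    """Made-up 1D stand-in for the globally applied dwm_lut 3DLUT."""
    return min(1.0, v * 0.98 + 0.002)

v_file = 0.2343
managed = display_lut(srgb_eotf(v_file))         # app decoded with the real profile
unmanaged = display_lut(app_assumption(v_file))  # app guessed; LUT applied on top anyway

print(f"managed:   {managed:.4f}")
print(f"unmanaged: {unmanaged:.4f}")
```

The two final values differ even though the same LUT is applied in both cases, which is the concern stated above: the 3DLUT cannot undo a wrong app-side assumption.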

I hope the example is clear; looking forward to your answer.
