[ FF7 ] NTSC-J Gamut mode dithers the entire screen. #757
Comments
Thanks for the report. This was implemented by @ChthonVII, so I'd appreciate feedback from him on the matter before touching anything around NTSC-J.
I've done some testing, and it is the dithering in ffnx.post.frag that seems to cause this issue; disabling it there seems to improve the image quality.
A bit more testing reveals that the gamut conversion does introduce banding in the menu gradients, which would indeed need dithering to fix. The gamut converter applies the same dither mask whether it's needed or not. While such dithering will indeed fix banding, it's still most likely wrong to apply it in post-processing: it should be applied only to the original texture itself, and the gamut conversion should be done there and nowhere else. This is particularly obvious when you use non-upscaled assets, where you can see a pattern inside the clearly visible pixels.

I also believe it's wrong to apply it to the gradient-filled dialog and menu boxes, because those are generated in sRGB. The correct thing to do would be to gamut convert the default colors and then not modify the generated gradient. NTSC-J gamut mode should only cause a conversion in stuff that needs it, and only once.

In fact, there is a stupidly common use case of upscaled textures and changed models which makes both choices wrong. I also don't believe gamut converting upscales is actually correct, since the upscaling process introduces new data in the wrong gamut. The correct thing to do is probably to convert the original asset, dither only the parts that are improved by the dither, and then AI upscale and tweak.

As a computer player sitting close to my flat panel, dithering is very visible. For someone playing on a couch in big picture mode, it is probably not noticeable. This bears out my results when I disable it in post: I can only spot banding in the dialog box gradients.
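For illustration only, here is a rough sketch of the "convert the default colors once, then leave the generated gradient alone" idea described above. The matrix, helper names, and gradient code are hypothetical placeholders, not FFNx's actual NTSC-J implementation.

```cpp
#include <array>

using Vec3 = std::array<float, 3>;

// Placeholder 3x3 gamut matrix (identity here); the real NTSC-J -> sRGB
// conversion in FFNx uses its own matrix and gamma handling.
constexpr float kGamut[3][3] = {
    {1.0f, 0.0f, 0.0f},
    {0.0f, 1.0f, 0.0f},
    {0.0f, 0.0f, 1.0f}};

Vec3 gamutConvert(const Vec3& c) {
    Vec3 out{};
    for (int i = 0; i < 3; ++i)
        out[i] = kGamut[i][0] * c[0] + kGamut[i][1] * c[1] + kGamut[i][2] * c[2];
    return out;
}

Vec3 lerp(const Vec3& a, const Vec3& b, float t) {
    return {a[0] + (b[0] - a[0]) * t,
            a[1] + (b[1] - a[1]) * t,
            a[2] + (b[2] - a[2]) * t};
}

// Convert the box's two default colors once, then generate the gradient
// from the converted endpoints. No per-pixel post-processing (and thus no
// fullscreen dither mask) is needed for the generated pixels.
Vec3 menuGradientPixel(const Vec3& top, const Vec3& bottom, float t) {
    const Vec3 topConv = gamutConvert(top);
    const Vec3 bottomConv = gamutConvert(bottom);
    return lerp(topConv, bottomConv, t);
}
```

The point of the sketch is that the conversion happens once, on the two source colors, so the interpolated gradient never needs to be touched by a post-pass dither.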
The post-pass dither assumes that the individual pixels aren't visible on your screen. If they aren't, yeah, you can't really see it. But people who are gaming on a budget are likely to have the sort of monitor I do, which is only 23.6 inches diagonal but has 1080p native resolution, so it's only about 93 dpi, and the dithering is extremely visible at the low end. Not everyone has retina displays, after all. :)

I'm also amused that my hunch that it was put in to fix the nasty banding in the opening video was correct. I do not remember said banding being visible on an actual TV, and when I used a TV-out to send the PC's output to an actual standard-definition CRT TV, sure enough, it looked just like I remembered it on the actual PlayStation. This tells me the proper solution is just to slap a good CRT shader on the output to get the pixels to smear enough to hide the dithering again.
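For reference, a minimal sketch of the pixel-density arithmetic behind the ~93 dpi figure above, assuming a 1920x1080 panel with a 23.6-inch diagonal:

```cpp
#include <cmath>
#include <cstdio>

// Pixels per inch for a panel, from its native resolution and diagonal size.
int main() {
    const double w = 1920.0, h = 1080.0;                      // native resolution
    const double diagonal_inches = 23.6;                      // panel diagonal
    const double diagonal_pixels = std::sqrt(w * w + h * h);  // ~2203 px
    const double ppi = diagonal_pixels / diagonal_inches;     // ~93.3 ppi
    std::printf("%.1f ppi\n", ppi);
    return 0;
}
```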
Yeah, the minimum viewing distance for a screen like that is something like 3-4 feet; that's why you're able to see the dithering. You'd be better off removing the dithering than using a CRT shader. Those shaders really distort the color. Replacing the dithering with rounding would guarantee colors are at most a half step off. (I do have a draft RetroArch CRT shader that doesn't distort color, assuming adequate viewing distance and further assuming your monitor accurately implements sRGB gamma. So, unfortunately, that probably wouldn't help you.) The nasty banding near black in the opening video is due to a gamma problem. I fixed that with a custom gamma function, but I want to go back and try a more principled fix. There's also lots of plain old banding in all the videos that dithering helps with to some extent.
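For illustration, a hedged sketch (not FFNx's actual shader code) of the two quantization strategies discussed above: plain rounding bounds the error to half of one 8-bit step, while an ordered (Bayer-style) dither trades that bound for noise that hides banding.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdint>

// Quantize a [0,1] channel value to 8 bits by rounding:
// the result is never more than half a step from the input.
uint8_t quantize_round(float v) {
    return static_cast<uint8_t>(std::lround(std::clamp(v, 0.0f, 1.0f) * 255.0f));
}

// Quantize with a 4x4 ordered-dither threshold instead of rounding:
// an individual pixel may be up to a full step off, but averaged over
// neighbouring pixels the gradient keeps more apparent precision,
// which hides banding at the cost of visible noise at low DPI.
uint8_t quantize_dither(float v, int x, int y) {
    static const float bayer4[4][4] = {
        { 0.0f,  8.0f,  2.0f, 10.0f},
        {12.0f,  4.0f, 14.0f,  6.0f},
        { 3.0f, 11.0f,  1.0f,  9.0f},
        {15.0f,  7.0f, 13.0f,  5.0f}};
    const float threshold = (bayer4[y & 3][x & 3] + 0.5f) / 16.0f;  // in (0,1)
    const float scaled = std::clamp(v, 0.0f, 1.0f) * 255.0f;
    return static_cast<uint8_t>(std::min(255.0f, std::floor(scaled + threshold)));
}
```

The sketch is only meant to show the tradeoff being discussed; whichever strategy is chosen would run after the gamut-mapping step, not in place of it.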
Describe the bug
Enabling NTSC-J gamut applies a fullscreen quasirandom dither, when things would most likely look better without it.
To Reproduce
Steps to reproduce the behavior:
Expected behavior
The game should look like it would on a Japanese NTSC TV with yellow boost, with no added dithering.
Screenshots
These are with syw unified upscaled field and ninostyle chibi.
The dithering is highly visible at lower DPI on a flat panel.
GPU:
But this isn't a driver issue, as I can see the code in the source that dithers the entire screen whenever a gamut conversion is done.
Additional context
Dithering is only needed to hide visible quantization noise, and shouldn't be needed for gamut conversions at 8 bits per channel on content with no banding. I have no trouble changing gamut in Photoshop 2 without introducing dithering or banding.
I suggest making dither_gamut_conversions an option, so the user can decide whether it increases or decreases quality.
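A possible shape for that option, as a hedged sketch only: the option name comes from this issue's suggestion, and the surrounding struct, helpers, and plumbing are hypothetical, not FFNx's actual code.

```cpp
// Hedged sketch: the flag name follows the suggestion above; FFNx does not
// currently expose it, and the helpers here are stand-ins.
struct Color { float r, g, b; };

// Stand-in for the real NTSC-J -> sRGB conversion (identity here).
static Color applyNtscJGamutConversion(const Color& c) { return c; }

// Stand-in for the per-pixel dither offset (a tiny 2x2 checker here).
static float ditherOffset(int x, int y) {
    return (((x + y) & 1) ? 0.5f : -0.5f) / 255.0f;
}

// Hypothetical config flag, e.g. read from FFNx.toml at startup.
static bool dither_gamut_conversions = false;

// Post-pass pixel: always convert the gamut, but only dither when opted in.
static Color postProcessPixel(const Color& in, int x, int y) {
    Color out = applyNtscJGamutConversion(in);
    if (dither_gamut_conversions) {
        const float d = ditherOffset(x, y);
        out.r += d; out.g += d; out.b += d;
    }
    return out;
}
```

That way, users on low-DPI desktop panels could turn the dither off, while couch and TV setups could keep it.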