
[ FF7 ] NTSC-J Gamut mode dithers the entire screen. #757

Open
zaphod77 opened this issue Dec 1, 2024 · 6 comments
Labels: bug (Something isn't working), help wanted (Extra attention is needed)

Comments

zaphod77 (Contributor) commented Dec 1, 2024

Describe the bug
Enabling NTSC-J gamut applies a fullscreen quasirandom dither, when things would most likely look better without it.

To Reproduce
Steps to reproduce the behavior:

  1. Set enable_ntscj_gamut_mode to true in ffnx.toml.
  2. Install some mods with upscales.
  3. Start the game.
  4. Start a new game.
  5. Skip the FMV to witness the problem.

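For reference, step 1 corresponds to this line in ffnx.toml (placement within the file may vary by FFNx version):

```toml
enable_ntscj_gamut_mode = true
```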
Expected behavior
The game to look like it should on a Japanese NTSC tv with yellow boost, with no added dithering.

Screenshots
These are with the SYW unified upscaled fields and Ninostyle chibi mods.

(two screenshots attached)
The dithering is highly visible at lower DPI on a flatpanel.
GPU:

  • Nvidia GTX 1060 3GB
  • Driver Version 561.03 (current)

But this isn't a driver issue: I can see the code in the source that dithers the entire screen whenever a gamut conversion is done.

Additional context
Dithering is only needed to hide visible quantization noise, and shouldn't be needed for gamut conversions at 8 bits per channel on content with no banding. I have no trouble changing gamut in Photoshop 2 without introducing dithering or banding.

I suggest making dither_gamut_conversions an option, so the user can decide if it increases or decreases quality.
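A sketch of what that toggle could look like in ffnx.toml — note that dither_gamut_conversions is only the name proposed above, not an option that exists yet:

```toml
enable_ntscj_gamut_mode = true
# Hypothetical: let the user opt out of the post-pass dither
dither_gamut_conversions = false
```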

@zaphod77 added the bug label Dec 1, 2024
julianxhokaxhiu (Owner) commented

Thanks for the report. This was implemented by @ChthonVII, so I'd appreciate feedback from him on the matter before touching anything around NTSC-J.

@julianxhokaxhiu added the help wanted label Dec 1, 2024
zaphod77 (Contributor, Author) commented Dec 5, 2024

I've done some testing, and it's the dithering in ffnx.post.frag that causes this issue; disabling it there improves the image quality.

zaphod77 (Contributor, Author) commented Dec 7, 2024

A bit more testing reveals that the gamut conversion does introduce banding in the menu gradients, which would indeed need the dithering to fix.

The gamut converter applies the same dither mask whether it's needed or not. While such dithering will indeed fix banding, it's still most likely wrong to apply it in post-processing: it should be applied only to the original texture itself, and the gamut conversion should be done there and nowhere else. This is particularly obvious when you use non-upscaled assets, where you can see a dither pattern inside the clearly visible pixels. I also believe it's wrong to apply it to the gradient-filled dialog and menu boxes, because those are generated in sRGB. The correct thing to do would be to gamut-convert the default colors and then leave the generated gradient unmodified.
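The "convert once, at the source texture" idea amounts to a single 3x3 matrix multiply per pixel in linear light. A minimal sketch (Python/NumPy; the NTSC-J primaries and D93 white point below are illustrative approximations, not the values FFNx uses, and no chromatic adaptation step is included):

```python
import numpy as np

def rgb_to_xyz_matrix(primaries, white):
    """Build an RGB->XYZ matrix from chromaticity coordinates."""
    # Columns are the XYZ of the R, G, B primaries at unit Y.
    m = np.array([[x / y, 1.0, (1 - x - y) / y] for x, y in primaries]).T
    xw, yw = white
    w = np.array([xw / yw, 1.0, (1 - xw - yw) / yw])  # white point XYZ
    s = np.linalg.solve(m, w)  # scale so RGB(1,1,1) lands on the white point
    return m * s

# sRGB / Rec. 709 primaries with D65 white.
SRGB_PRIMARIES = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]
D65 = (0.3127, 0.3290)

# Illustrative stand-ins: 1953 NTSC primaries with a ~9300K (D93) white.
# Real mid-90s Japanese CRT phosphors differ; these are NOT FFNx's values.
NTSCJ_PRIMARIES = [(0.67, 0.33), (0.21, 0.71), (0.14, 0.08)]
D93 = (0.2848, 0.2932)

def conversion_matrix(src_prim, src_white, dst_prim, dst_white):
    """Source RGB -> destination RGB, both in linear light."""
    src = rgb_to_xyz_matrix(src_prim, src_white)
    dst = rgb_to_xyz_matrix(dst_prim, dst_white)
    return np.linalg.inv(dst) @ src

M = conversion_matrix(NTSCJ_PRIMARIES, D93, SRGB_PRIMARIES, D65)

def convert_texture(linear_rgb):
    """Apply the gamut conversion once, to linear-light pixels of shape (N, 3)."""
    return np.clip(linear_rgb @ M.T, 0.0, 1.0)
```

Done at asset-load time like this, the shader would never need to touch (or dither) already-converted pixels again.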

NTSC-J gamut mode should only convert the assets that need it, and only once. In fact, the stupidly common use case of upscaled textures plus changed models makes both all-or-nothing choices wrong.

I also don't believe gamut-converting upscales is actually correct, since the upscaling process introduces new data in the wrong gamut. The correct thing to do is probably to convert the original asset, dither only the parts that the dither improves, and then AI-upscale and tweak.

As a computer player sitting close to my flat panel, the dithering is very visible. For someone playing from a couch in Big Picture, it's probably not noticeable.

This bears out my results when I disable it in post: I can only spot banding in the dialog box gradients.

ChthonVII (Contributor) commented

  1. What's the DPI on that flatpanel? I think this might be more of a low-DPI issue than a dithering issue. I don't see any dithering artifacts on my monitor. For that matter, I can't see the dithering artifacts in your screenshots unless I zoom in at least ~2x.
  2. Dithering is the "correct" procedure following an operation that doesn't land cleanly on a quantization step. Not just because we might need to hide quantization artifacts, but because the color we're trying to represent is between quantization steps, and dithering is the only tool available for representing such colors (aside from buying an HDR monitor). That said, if it looks better on your screen without dithering, then who cares about being correct? Go ahead and add an option to toggle dithering if you like.
  3. I agree that applying gamut conversion to all modded assets isn't ideal. This was discussed at great length when NTSC-J mode was introduced. The current implementation is "all or none" because a selective implementation was insanely more work than anyone was willing to put in. It would require reworking both 7th Heaven and FFNx to tag every asset with metadata throughout the whole pipeline so that the shader knows what gamut (and also what gamma function) it's supposed to use. That's a feature I'd really like to have, so you have my full encouragement to go implement it. (Also, gamut conversion is often "less wrong" for modded assets than not doing it, because often the modder used an original asset incorrectly displayed as sRGB as a color reference.)
  4. Another issue that I get the feeling you may not understand is that gamut correction needs to come very late in the pipeline because it's an aspect of the CRT, not the PS1. So everything that happens in the PS1 -- alpha blending, palette effects, lighting, etc. -- all has to happen before gamut conversion. Doing those PS1 operations on corrected colors would lead to really wrong results. That's why we can't just correct hardcoded colors like text boxes as the very first step.
  5. Gamut converting upscaled images is very close to correct. The interpolated pixels are generally miscolored in the same way as the original pixels. There's a small gamma error in the upscaler's blending because it's using the wrong operations to linearize. You could slightly improve matters by gamut converting and gamma correcting the originals, then doing the AI upscale, then displaying the result as sRGB. But that would be a lot of work, and the change might not even be perceptible.
  6. When I get around to it, I need to completely redo NTSC-J mode. Thanks to a lot of research and some help from a brilliant chap on the retroarch forums, I've now got some reasonably solid information on the "color correction" circuits and actual phosphor chromaticities of mid-90s Japanese CRTs. I could now do a meaningfully better job of matching their colors. I've also got a plausible CRT gamma function that's more principled than the "at least it looks good" function I created to fix the video banding, that I want to try across the board (though it will be a pain because I either need to add selectable gamma to every asset load, or convince julianxhokaxhiu to use CRT gamma for everything all the time). I've still got some work to finish up on gamutthingy before I start that. (Also, I'm really deeply dreading having to touch vcpkg again.)
  7. Part of that rework will be moving up to 128x128x128 LUTs and storing the LUTs in sRGB so that more bits are dedicated to dark colors. That should help with banding in dark blues, even without dithering.
  8. Another part of the rework will be using PS1-output-gamma-space R'G'B' for the LUT indices, since the CRT color correction circuit produces out-of-bounds values that would be difficult to deal with otherwise. (For the most part they aren't truly out of bounds. Rather, the issue is that the range of inputs isn't really coextensive with the phosphor gamut. (There's shit like "super black" and 110% white and stuff like that.) But we need to quantify them as out of bounds (and use the gamut compression function to pull them in bounds) because the alternative is to set "black" and "white" at grey points pretty far inside the phosphor gamut, and that would look really ugly.) (Together items 7 & 8 necessitate adding some gamma correction to the LUT interpolation function.)
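Point 2 above — that dithering is the way to represent a color lying between quantization steps — can be demonstrated with a few lines of illustrative Python:

```python
import random

random.seed(0)
target = 100.4 / 255.0  # a color that falls between two 8-bit steps
n = 100_000

# Rounding: every pixel gets the nearest step, leaving a fixed 0.4/255 bias.
rounded_err = abs(round(target * 255) / 255.0 - target)

# Dithering: add noise before truncating, so the fractional part is spread
# across pixels and the average over an area lands on the target color.
dithered = [int(target * 255 + random.random()) / 255.0 for _ in range(n)]
dithered_err = abs(sum(dithered) / n - target)

print(dithered_err < rounded_err)  # True: the dithered average is far closer
```

The trade-off is exactly the one debated in this thread: the dithered area has the right average color but per-pixel noise, which only disappears if the pixels are too small (or far enough away) to resolve individually.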

zaphod77 (Contributor, Author) commented Dec 9, 2024

The post-pass dither assumes that the individual pixels aren't visible on your screen. If they aren't, then yeah, you can't really see it. But people gaming on a budget are likely to have the sort of monitor I do, which is only 23.6 inches diagonal but 1080p native, so it's only about 93 DPI. At the low end, the dithering is extremely visible. Not everyone has retina displays, after all. :)
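The pixel-density arithmetic checks out (plain Python):

```python
import math

# 1920x1080 native resolution on a 23.6-inch diagonal panel
ppi = math.hypot(1920, 1080) / 23.6
print(round(ppi))  # roughly 93 pixels per inch
```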

I'm also amused that my hunch was correct that it was put in to fix the nasty banding in the opening video. I do not remember that banding being visible on an actual TV, and when I used TV-out to send the PC's picture to a standard-definition CRT TV, sure enough, it looked just like I remembered it on the actual PlayStation.

This tells me the proper solution is just to slap a good CRT shader on the output, so the pixels smear enough to hide the dithering again.

ChthonVII (Contributor) commented

Yeah, the minimum viewing distance for a screen like that is something like 3-4 feet. That's why you're able to see the dithering.

You'd be better off removing the dithering than using a CRT shader. Those shaders really distort the color. Replacing the dithering with rounding would guarantee colors are at most a half step off. (I do have a draft retroarch CRT shader that doesn't distort color, assuming adequate viewing distance, and further assuming your monitor accurately implements sRGB gamma. So, unfortunately, that probably wouldn't help you.)
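The "at most a half step off" claim about rounding is easy to verify numerically (illustrative Python):

```python
# Quantize a dense sweep of [0, 1] to 8 bits by rounding and record the
# worst-case error, measured in 8-bit steps (1/255).
worst = max(abs(round(v * 255) / 255 - v)
            for v in (k / 100_000 for k in range(100_001)))
print(worst * 255)  # worst-case error in steps; never exceeds 0.5
```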

The nasty banding near black in the opening video is due to a gamma problem. I fixed that with a custom gamma function, but I want to go back and try a more principled fix. There's also lots of plain old banding in all the videos that dithering helps with to some extent.
