Problems with the dithering palette calculation #188

Open

j4james opened this issue Feb 16, 2025 · 0 comments
j4james commented Feb 16, 2025

When converting an image to sixel, we need to create a palette that best represents all the colors in the image before performing the dithering. I noticed two problems with this process.

  1. If you create a small 6x6 image with just two colors, a single-pixel border of red and the rest white, img2sixel will end up converting it into a block of solid red (a repro sketch follows this list). This is the fault of the computeHistogram function, which is supposed to gather the most frequently used colors in the image, but in this case returns just one (i.e. just the red).

  2. There is a color quantization quality mode that can be low, high, or full. That mode determines how many samples computeHistogram gathers when calculating the color palette. However, the low and high options are currently identical. This used to work, but it looks like the sample size was mistakenly changed during some refactoring (see commit e40bf2d).
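Here's a minimal repro sketch for the first problem. It writes a 6x6 binary PPM with a one-pixel red border and a white interior; the filename and the choice of PPM are just for illustration, and this assumes your img2sixel build can read binary PPM (any 6x6 two-color image in a supported format works just as well). Before the patch, feeding the result to img2sixel produces a solid red block; with the patch the white interior is preserved.

/* make_border.c: writes border.ppm, a 6x6 image with a one-pixel
 * red border and a white interior. */
#include <stdio.h>

int main(void)
{
    const int w = 6, h = 6;
    FILE *f = fopen("border.ppm", "wb");
    if (f == NULL) {
        return 1;
    }
    fprintf(f, "P6\n%d %d\n255\n", w, h);
    for (int y = 0; y < h; ++y) {
        for (int x = 0; x < w; ++x) {
            int border = (x == 0 || y == 0 || x == w - 1 || y == h - 1);
            unsigned char px[3];
            px[0] = 255;                       /* red channel: always on */
            px[1] = px[2] = border ? 0 : 255;  /* green/blue: off on the border */
            fwrite(px, 1, sizeof(px), f);
        }
    }
    fclose(f);
    return 0;
}

Then run img2sixel border.ppm and compare the output before and after the patch.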

The patch below solves both these issues:

diff --git a/src/quant.c b/src/quant.c
index 4490edf..aaf3822 100644
--- a/src/quant.c
+++ b/src/quant.c
@@ -691,23 +691,17 @@ computeHistogram(unsigned char const    /* in */  *data,
     switch (qualityMode) {
     case SIXEL_QUALITY_LOW:
         max_sample = 18383;
-        step = length / depth / max_sample * depth;
         break;
     case SIXEL_QUALITY_HIGH:
-        max_sample = 18383;
-        step = length / depth / max_sample * depth;
+        max_sample = 1118383;
         break;
     case SIXEL_QUALITY_FULL:
     default:
         max_sample = 4003079;
-        step = length / depth / max_sample * depth;
         break;
     }
 
-    if (length < max_sample * depth) {
-        step = 6 * depth;
-    }
-
+    step = length / depth / max_sample * depth;
     if (step <= 0) {
         step = depth;
     }

The first issue is solved by removing the test that set the step to 6 * depth when an image was smaller than the max sample size. In my example above, that produced an unnecessarily large step for a tiny image, so very few pixels were sampled (in that particular case, just the left border of the image).

I'm almost certain there was no need for that code. I think it was added in a misguided attempt to fix a bug that was actually caused elsewhere. There's already another check that makes sure the step isn't zero, and that's all we really need here.
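To make the arithmetic concrete, here's a standalone sketch of the old versus new step calculation for that 6x6 RGB image (length = 108, depth = 3, SIXEL_QUALITY_LOW). The sampling loop mirrors how computeHistogram walks the pixel data, but it's an illustration rather than the library's exact code.

/* step_demo.c: shows which pixels the old and new step logic would sample
 * for a 6x6 RGB image under SIXEL_QUALITY_LOW. */
#include <stdio.h>

int main(void)
{
    const int width = 6, height = 6, depth = 3;
    const int length = width * height * depth;   /* 108 bytes */
    const int max_sample = 18383;                /* SIXEL_QUALITY_LOW */

    /* Old behaviour: integer division collapses the step to 0, and the
     * small-image test then forces it up to 6 * depth. */
    int step = length / depth / max_sample * depth;  /* 108/3/18383*3 == 0 */
    if (length < max_sample * depth) {
        step = 6 * depth;                            /* 18 */
    }
    printf("old step = %d, pixels sampled:", step);
    for (int i = 0; i < length; i += step) {
        int pixel = i / depth;
        printf(" (%d,%d)", pixel / width, pixel % width);  /* all in column 0 */
    }
    printf("\n");

    /* Patched behaviour: the step falls back to depth, so every pixel
     * of a tiny image gets sampled. */
    step = length / depth / max_sample * depth;
    if (step <= 0) {
        step = depth;
    }
    printf("new step = %d (every pixel sampled)\n", step);
    return 0;
}

With the old logic only the six left-border pixels are sampled, which is why the histogram sees nothing but red.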

At the same time I moved the step calculation down out of the switch statement, because it was identical in every case. I also updated the max_sample value for the SIXEL_QUALITY_HIGH case to fix the second issue, using the value it was originally set to.
