When converting an image to sixel, we need to create a palette that best represents all the colors in the image before performing the dithering. I noticed two problems with this process.
If you create a small 6x6 image with just two colors, a single-pixel red border and a white interior, img2sixel ends up converting it into a block of solid red. This is the fault of the `computeHistogram` function, which is supposed to gather the most frequently used colors in the image, but in this case returns only one (the red).
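To make the arithmetic concrete, here's a small self-contained C program (my own illustration, not libsixel code) that builds the 6x6 example and samples it with the forced step of `6 * depth`. For an RGB image, one row is `6 * 3 = 18` bytes, so a step of 18 bytes lands on the first pixel of every row, which in this image is always the red left border:

```c
/* Minimal sketch (not libsixel source) of why a forced step of 6 * depth
 * samples only the left border of the 6x6 example image. "step" counts
 * bytes, as in the histogram sampling loop. */
#include <stdio.h>

int main(void)
{
    enum { W = 6, H = 6, DEPTH = 3 };
    unsigned char img[W * H * DEPTH];
    int x, y, i, reds = 0, whites = 0;

    /* Build the example: a one-pixel red border around a white interior. */
    for (y = 0; y < H; y++) {
        for (x = 0; x < W; x++) {
            int border = (x == 0 || y == 0 || x == W - 1 || y == H - 1);
            unsigned char *p = img + (y * W + x) * DEPTH;
            p[0] = 255;               /* R */
            p[1] = border ? 0 : 255;  /* G */
            p[2] = border ? 0 : 255;  /* B */
        }
    }

    /* Sample the way the buggy path does: step = 6 * DEPTH = 18 bytes,
     * i.e. exactly one row, so only the first pixel of each row (the red
     * left border) is ever visited. */
    for (i = 0; i < (int)sizeof(img); i += 6 * DEPTH) {
        if (img[i + 1] == 0)
            reds++;
        else
            whites++;
    }
    printf("sampled %d red pixels, %d white pixels\n", reds, whites);
    return 0;
}
```

Running it prints `sampled 6 red pixels, 0 white pixels`, so the histogram never sees white at all.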
There is a color quantization quality mode that can be low, high, or full. That mode determines how many samples `computeHistogram` gathers when calculating the color palette. However, the low and high options are currently exactly the same. This used to work, but it looks like the sample size was mistakenly changed during some refactoring (see commit e40bf2d).
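For reference, here's a compilable sketch of roughly how that logic is shaped at the moment. This is not a copy of `computeHistogram`, and the sample counts are placeholders I've made up, but it shows both problems in one place: LOW and HIGH compute the same step, and small images get the forced `6 * depth` step:

```c
/* Rough sketch (placeholder constants, not the real libsixel values). */
#include <stddef.h>

enum quality { SIXEL_QUALITY_LOW, SIXEL_QUALITY_HIGH, SIXEL_QUALITY_FULL };

static size_t
sampling_step(enum quality mode, size_t length, size_t depth)
{
    size_t max_sample, step;

    switch (mode) {
    case SIXEL_QUALITY_LOW:
        max_sample = 10000;              /* placeholder */
        step = length / depth / max_sample * depth;
        break;
    case SIXEL_QUALITY_HIGH:
        max_sample = 10000;              /* placeholder -- identical to LOW */
        step = length / depth / max_sample * depth;
        break;
    case SIXEL_QUALITY_FULL:
    default:
        max_sample = 1000000;            /* placeholder */
        step = length / depth / max_sample * depth;
        break;
    }

    /* The problematic special case: for images smaller than the sample
     * budget, the step is forced to 6 * depth regardless of image size. */
    if (length < max_sample * depth) {
        step = 6 * depth;
    }
    if (step == 0) {
        step = depth;
    }
    return step;
}
```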
The first issue is solved by removing the test that sets the step to `6 * depth` when an image is smaller than the max sample size. In my example above, that produced an unnecessarily large step for a tiny image, so very few pixels were sampled (in that particular case, just the left border of the image).
I'm almost certain there was no need for that code. I think it was added in a misguided attempt to fix a bug that was actually caused elsewhere. There's already another check that makes sure the step isn't zero, and that's all we really need here.
At the same time, I moved the step calculation down out of the switch statement, because it was identical in every case. I also updated the `max_sample` value for the `SIXEL_QUALITY_HIGH` case to fix the second issue, restoring the value it was originally set to.
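Continuing the sketch above (same placeholder constants, and not the actual patch; in particular the restored HIGH value isn't quoted here, so that number is purely illustrative), the fixed shape looks roughly like this:

```c
/* Sketch of the fixed logic: each mode only picks max_sample, the step is
 * computed once after the switch, the 6 * depth special case is gone, and
 * the existing zero-step guard still applies. Reuses enum quality from the
 * previous sketch. */
static size_t
sampling_step_fixed(enum quality mode, size_t length, size_t depth)
{
    size_t max_sample, step;

    switch (mode) {
    case SIXEL_QUALITY_LOW:
        max_sample = 10000;     /* placeholder */
        break;
    case SIXEL_QUALITY_HIGH:
        max_sample = 100000;    /* placeholder for the restored, larger value */
        break;
    case SIXEL_QUALITY_FULL:
    default:
        max_sample = 1000000;   /* placeholder */
        break;
    }

    /* Identical in every branch, so it lives below the switch now. */
    step = length / depth / max_sample * depth;
    if (step == 0) {
        step = depth;           /* existing guard: never sample with a zero step */
    }
    return step;
}
```

With this shape, the 6x6 example falls through to the zero-step guard and gets `step = depth`, so every pixel is sampled and both red and white make it into the histogram.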