Comment by mark-r

2 days ago

A good scaling algorithm would take Nyquist limits into account. For example, if you're using bicubic to resize to 1/3 the original size, you wouldn't use a 4x4 grid but a 12x12 grid. The formula for calculating the weights is easily stretched out. Oh, and don't forget to de-gamma your image first. It's too bad that good scaling is so rare.
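Roughly, the stretching looks like this; a minimal, untested sketch in Python, assuming a Catmull-Rom cubic and made-up helper names:

    import numpy as np

    def cubic(x, a=-0.5):
        # Keys/Catmull-Rom cubic kernel, half-width 2 -- the usual "bicubic"
        x = abs(x)
        if x < 1:
            return (a + 2) * x**3 - (a + 3) * x**2 + 1
        if x < 2:
            return a * x**3 - 5*a * x**2 + 8*a * x - 4*a
        return 0.0

    def stretched_weights(center, scale):
        # Downscaling by `scale` stretches the kernel by the same factor,
        # so the half-width grows from 2 taps to 2*scale taps: at 1/3 size
        # that's 12 taps per axis, i.e. a 12x12 grid in 2D.
        support = 2 * scale
        left = int(np.floor(center - support)) + 1
        taps = np.arange(left, left + 2 * support)
        w = np.array([cubic((t - center) / scale) for t in taps])
        return taps, w / w.sum()  # normalize so the weights sum to 1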

Yeah, it seems that a lot of this is due to marginal-quality resampling algorithms that allow significant amounts of aliasing. The paper does mention that even a good algorithm with proper kernel sizing can still leak remnants due to quantization, though the effect is greatly diminished.

I'm surprised that such well-known libraries are still basically using mipmapping; proper-quality resampling filters were doable on real-time video on CPUs more than 15 years ago. Gamma correction arguably costs more than a properly sized reduction kernel, and depending on the content you can get away without it more often than you can get away with skimping on the filter.
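For reference, the de-gamma/re-gamma bracket around the filter is just this (untested sketch, assuming sRGB input and a placeholder resize_linear function):

    import numpy as np

    def srgb_to_linear(u):
        # Standard sRGB decode; u is a float array scaled to [0, 1]
        return np.where(u <= 0.04045, u / 12.92, ((u + 0.055) / 1.055) ** 2.4)

    def linear_to_srgb(v):
        # Standard sRGB encode, the inverse of the above
        return np.where(v <= 0.0031308, v * 12.92, 1.055 * v ** (1 / 2.4) - 0.055)

    # resize_linear() stands in for whatever filter you use; the point is
    # that the filtering happens on linear-light values:
    #   out = linear_to_srgb(resize_linear(srgb_to_linear(img)))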