Comment by im3w1l
5 years ago
I would not define it exactly like that. I would say "Dithering is any method of reducing the bit depth of a signal that prioritizes accurate representation of low frequencies over that of high frequencies". This frames it as essentially an optimization problem, with random noise being a heuristic way of accomplishing it.
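A minimal sketch of that heuristic in Python (the toy signal, level count, and moving-average "lowpass" are my own choices, not from the comment): adding roughly one quantization step of uniform noise before rounding trades staircase error that is correlated with the signal for broadband hiss, so the low-frequency content survives better.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy low-frequency signal, quantized down to 8 levels.
t = np.linspace(0, 1, 1000)
signal = 0.5 + 0.4 * np.sin(2 * np.pi * 3 * t)
levels = 8
step = 1.0 / (levels - 1)

def quantize(x):
    return np.round(x * (levels - 1)) / (levels - 1)

def lowpass(x, width=50):
    # Crude moving-average filter standing in for "the low frequencies".
    return np.convolve(x, np.ones(width) / width, mode="same")

# Plain quantization: the error is a staircase correlated with the signal,
# so some of it is low-frequency and survives the lowpass.
plain = quantize(signal)

# Dithered quantization: one step of uniform noise decorrelates the error,
# pushing it into hiss that the lowpass averages away.
dithered = quantize(signal + rng.uniform(-step / 2, step / 2, signal.shape))

err_plain = np.mean((lowpass(plain) - lowpass(signal)) ** 2)
err_dith = np.mean((lowpass(dithered) - lowpass(signal)) ** 2)
print(err_dith < err_plain)
```

The dithered version should track the true signal more closely after lowpass filtering, at the cost of extra high-frequency noise, which is exactly the trade the definition above describes.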
I feel like that is still a very narrow definition. Dithering is useful anywhere quantization produces an unwanted result, including plenty of places where "bitrate" isn't even a concept.
Good image dithering algorithms do maintain sharp features like edges.
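Floyd–Steinberg error diffusion is the classic example of such an algorithm: each pixel's quantization error is pushed onto not-yet-visited neighbours, so flat areas average back to the right grey level while a hard boundary stays hard. A rough sketch (the toy image and the checks at the end are my own, not from the comment):

```python
import numpy as np

def floyd_steinberg(img):
    """Error-diffusion dither of a 2-D float image in [0, 1] to 1 bit."""
    out = img.astype(float).copy()
    h, w = out.shape
    for y in range(h):
        for x in range(w):
            old = out[y, x]
            new = 1.0 if old >= 0.5 else 0.0  # threshold to black/white
            out[y, x] = new
            err = old - new
            # Distribute the quantization error onto unvisited neighbours
            # with the standard Floyd-Steinberg weights (7, 3, 5, 1)/16.
            if x + 1 < w:
                out[y, x + 1] += err * 7 / 16
            if y + 1 < h:
                if x > 0:
                    out[y + 1, x - 1] += err * 3 / 16
                out[y + 1, x] += err * 5 / 16
                if x + 1 < w:
                    out[y + 1, x + 1] += err * 1 / 16
    return out

# A mid-grey field with a hard edge to solid white down the middle.
img = np.full((32, 32), 0.5)
img[:, 16:] = 1.0
dithered = floyd_steinberg(img)

# The right half stays solid white (the edge survives), while the
# left half becomes a black/white pattern averaging back to ~0.5.
print(float(dithered[:, 16:].min()))
print(float(dithered[:, :16].mean()))
```

The grey region comes out as a checker-like pattern of black and white pixels whose local average matches the input, while the white region is untouched, which is the sense in which error diffusion keeps edges sharp.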