Comment by amai

7 months ago

> TrueType fonts have had a Turing complete virtual machine (almost?) since the beginning. It is used for "hinting" to allow partially colored pixels at low resolutions to remain legible. It's basically a program that decides whether to color a pixel or not to allow fine tuning of low resolution rasterization.

That sounds like an awful idea, too. I think a font file should describe the font's form, but not how it is going to be rendered. That should be up to the rendering engine of the device that displays it (printer driver, monitor driver, ...). But I guess this idea dates from a time when people were still using bitmap fonts.

You need to remember that when TrueType fonts were introduced, a typical display resolution was 640x480 to 800x600 on a 13-15" display.

If you rasterize the Bézier curve outlines of a TrueType font at that resolution, you get very jagged characters without anti-aliasing and very blurry ones with AA.
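
To make that concrete, here is a minimal sketch in C of evaluating one of the quadratic Bézier segments a TrueType outline is built from (the control points and the 2048-units-per-em scale are made up for illustration, not taken from any real font). At a 12-pixel em, curve points land on fractional pixel coordinates, and snapping them to such a coarse grid is exactly where the jaggedness comes from:

```c
#include <stdio.h>

typedef struct { double x, y; } Point;

/* Quadratic Bezier: the segment type TrueType outlines use.
 * P0 and P2 are on-curve points, P1 is the off-curve control point. */
static Point quad_bezier(Point p0, Point p1, Point p2, double t) {
    double u = 1.0 - t;
    Point r;
    r.x = u * u * p0.x + 2.0 * u * t * p1.x + t * t * p2.x;
    r.y = u * u * p0.y + 2.0 * u * t * p1.y + t * t * p2.y;
    return r;
}

int main(void) {
    /* Illustrative curve in font units (2048 units/em is a common
     * choice), scaled to a 12-pixel em. */
    double scale = 12.0 / 2048.0;
    Point p0 = {200, 0}, p1 = {600, 900}, p2 = {1100, 1400};
    for (int i = 0; i <= 4; i++) {
        double t = i / 4.0;
        Point p = quad_bezier(p0, p1, p2, t);
        /* Every point falls on fractional pixel coordinates;
         * hinting exists to nudge these onto the pixel grid. */
        printf("t=%.2f -> (%.2f, %.2f) px\n", t, p.x * scale, p.y * scale);
    }
    return 0;
}
```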

At the same time, the same font files needed to look good on paper, at a very different DPI.
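
For a sense of how different those pixel budgets are: a typographic point is 1/72 inch, so pixels = points × dpi / 72. A quick sketch:

```c
#include <stdio.h>

/* One typographic point is 1/72 inch, so
 * pixels = points * dpi / 72. */
static double points_to_pixels(double points, double dpi) {
    return points * dpi / 72.0;
}

int main(void) {
    printf("12 pt at  96 dpi screen : %.0f px\n", points_to_pixels(12, 96));  /* 16 px  */
    printf("12 pt at 600 dpi printer: %.0f px\n", points_to_pixels(12, 600)); /* 100 px */
    return 0;
}
```

The same 12 pt glyph gets about 16 pixels of height on a 96 dpi screen but about 100 on a 600 dpi laser printer, so a rasterizer tuned for one would look wrong on the other.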

It's a compromise between bitmap and outline fonts. Not ideal, but it delivered good results both on screen and on paper at the time.

The hinting engine is (I believe) not used that much anymore at today's resolutions, where we can comfortably just rasterize the outline with some AA and get good results.
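
If you want to see the difference yourself, here is a minimal sketch using FreeType (the font path is a placeholder and error handling is minimal). Passing FT_LOAD_NO_HINTING skips the bytecode interpreter entirely and just scales and rasterizes the outline, which is roughly what a modern high-DPI renderer gets away with:

```c
/* build: cc demo.c $(pkg-config --cflags --libs freetype2) */
#include <stdio.h>
#include <ft2build.h>
#include FT_FREETYPE_H

/* Render one glyph twice: with the TrueType hinting bytecode
 * applied (the default) and with FT_LOAD_NO_HINTING. */
static void render(FT_Face face, FT_Int32 flags, const char *label) {
    if (FT_Load_Char(face, 'g', FT_LOAD_RENDER | flags)) return;
    FT_Bitmap *bm = &face->glyph->bitmap;
    printf("%s: %ux%u px\n", label, bm->width, bm->rows);
    for (unsigned y = 0; y < bm->rows; y++) {
        for (unsigned x = 0; x < bm->width; x++)
            /* Crude ASCII ramp over the 8-bit coverage values
             * (assumes a positive pitch, the usual case). */
            putchar(" .:*#"[bm->buffer[y * bm->pitch + x] / 52]);
        putchar('\n');
    }
}

int main(void) {
    FT_Library lib;
    FT_Face face;
    if (FT_Init_FreeType(&lib)) return 1;
    if (FT_New_Face(lib, "/path/to/font.ttf", 0, &face)) return 1;
    FT_Set_Pixel_Sizes(face, 0, 12);  /* a small, hinting-relevant size */
    render(face, 0, "hinted (default)");
    render(face, FT_LOAD_NO_HINTING, "unhinted");
    FT_Done_Face(face);
    FT_Done_FreeType(lib);
    return 0;
}
```

At 12 pixels the two bitmaps usually differ visibly (stems snapped to whole pixels vs. smeared across two); at 40+ pixels the difference mostly disappears, which is the point.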