
Comment by troupo

12 hours ago

> That post doesn't say that it takes 16ms to create a scene and have the terminal rasterize and try and present it.

So they literally take 16ms to rasterize just a few hundred characters on screen. Of those, 11ms are spent in the "React scene graph", leaving 5ms for the extremely complex task of rendering a few characters.

16ms is an eternity. A game engine renders thousands of complex 3D objects in less time. You can output text to a terminal at hundreds of frames per second in JavaScript: https://youtu.be/LvW1HTSLPEk?si=G9gIwNknqXEWAM96
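
As a rough sanity check of that claim, here is a minimal Node.js sketch (my own micro-benchmark, not the one from the video) that redraws a few hundred characters in a loop for one second and counts how many full redraws it manages. Run it against a real terminal, since the result ultimately depends on the emulator at the other end:

```typescript
// bench-stdout.ts -- rough sketch: how many "frames" of a few hundred
// characters can Node.js push to the terminal per second?
// (Hypothetical benchmark, not the one from the linked video.)

const line = "x".repeat(80);                   // one 80-character row
const frame = Array(10).fill(line).join("\n"); // ~800 characters per "frame"

const start = process.hrtime.bigint();
let frames = 0;

// Redraw frames for roughly one second of wall-clock time.
while (process.hrtime.bigint() - start < 1_000_000_000n) {
  process.stdout.write("\x1b[H" + frame); // move cursor home, then redraw
  frames++;
}

const elapsedMs = Number(process.hrtime.bigint() - start) / 1e6;
process.stdout.write(`\n${frames} frames in ${elapsedMs.toFixed(1)} ms\n`);
```

Exact numbers vary with the terminal emulator doing the rasterization, but it gives a feel for how cheap the JavaScript side of "put a few hundred characters on screen" actually is.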

> and if the terminal has to go and load a font from disk to be able to rasterize something that can eat into the budget

Into which budget? They spend 11ms "laying out a scene" for a few hundred characters. "Reading something from disk" in order to render is a rare event. And that's before we start questioning assumptions about read speeds [1], whether anything in a TUI needs to be rendered at 60fps, etc.

[1] Modern SSDs can probably load half of their contents into cache before you even begin to see an impact on frame times. Unless it's a Microsoft terminal, for which they claim you need a PhD to make it fast.
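
To put the "budget" in concrete numbers (my own arithmetic, using the figures already mentioned in this thread plus an assumed, deliberately pessimistic 1ms disk read):

```typescript
// Frame-budget arithmetic with the numbers from this thread.
const frameBudgetMs = 1000 / 60; // ~16.67 ms per frame at 60 fps
const layoutMs = 11;             // reported time in the React scene graph
const diskReadMs = 1;            // assumed, pessimistic one-off disk read

console.log(`frame budget:      ${frameBudgetMs.toFixed(2)} ms`);
console.log(`left after layout: ${(frameBudgetMs - layoutMs).toFixed(2)} ms`);
console.log(`1 ms disk read  =  ${((diskReadMs / frameBudgetMs) * 100).toFixed(1)}% of a frame`);
```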

> So they literally take 16ms to rasterize just a few hundred characters on screen

Did you measure this yourself? Where is this number coming from? I am talking about a budget. Even if it takes 1ms total, as long as that is under 16ms, that is fine.

More on disk read speed: https://planetscale.com/blog/io-devices-and-latency

--- start quote ---

A typical random read [on a HDD] can be performed in 1-3 milliseconds.

A random read on an SSD varies by model, but can execute as fast as 16μs (μs = microsecond, which is one millionth of a second).

--- end quote ---

If you drop frames in a TUI on a HDD/SSD read for a font file (10-20 KB), you're likely completely incompetent.
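
For anyone who wants to check the font-file case on their own machine, here is a minimal Node.js sketch; the path is a placeholder, point it at any small (~10-20 KB) font file. Note that it will mostly measure a warm, page-cached read; a genuinely cold HDD read is the 1-3ms case from the quote above, still a fraction of one frame:

```typescript
// time-font-read.ts -- time one read of a small file and compare it
// to a 60 fps frame budget. The path below is a placeholder; point it
// at any ~10-20 KB font file on your system.
import { readFileSync } from "node:fs";

const FONT_PATH = "/path/to/some-font.ttf"; // placeholder path
const FRAME_BUDGET_MS = 1000 / 60;          // ~16.67 ms

const start = process.hrtime.bigint();
const bytes = readFileSync(FONT_PATH).length;
const elapsedMs = Number(process.hrtime.bigint() - start) / 1e6;

console.log(`read ${bytes} bytes in ${elapsedMs.toFixed(3)} ms`);
console.log(`that is ${((elapsedMs / FRAME_BUDGET_MS) * 100).toFixed(2)}% of one frame`);
```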