Comment by AlotOfReading
2 days ago
It's just realtime programming. I wouldn't say that realtime techniques are limited to a very narrow range of ultra-critical systems, given that they encompass everything from the code on your SIM card to games in your Steam library.
In large distributed systems, the best we can hope for is statistically acceptable behavior. You can always tailor a workload that will break a given guarantee in the real world.
This is called "soft" realtime.
"Soft" realtime just means that you have a time-utility function that doesn't step-change to zero at an a priori deadline. Virtually everything in the real world is at least a soft realtime system.
I don't disagree with you that it's a realtime problem; I do, however, think that "just" is doing a lot of work there.
There are multiple ways to deal with deadline misses for soft systems. Only some of them actually deliver the correct data, just late. A lot of systems will abort the execution and move on with zeros/last computed data instead, or drop the data entirely. A modern network AQM system like CAKE uses both delayed scheduling and intelligent dropping.
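Roughly, the policies look like this (my own sketch with hypothetical names, nothing to do with CAKE's actual code; a real system would preempt the work rather than let it run to completion and check afterwards):

    import time

    def run_with_deadline(compute, deadline_s, last_value=None, policy="fallback"):
        start = time.monotonic()
        result = compute()
        late = (time.monotonic() - start) > deadline_s
        if not late or policy == "late":
            return result        # on time, or we accept correct-but-late data
        if policy == "fallback":
            return last_value    # substitute last computed data (or zeros)
        return None              # "drop": discard the sample entirely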
Agreed though, "just" is hiding quite a deep rabbit hole.