Comment by hmage

5 years ago

Modern game engines buffer 3 or 4 frames, sometimes 5. It's not unusual to have 140 ms of latency on a 60 Hz screen between clicking mouse1 and seeing the muzzle flash. A few of the factors:

  * deferred vs forward rendering (deferred adds latency)
  * multithreaded vs singlethreaded
  * vsync (double buffering)

https://www.youtube.com/watch?v=8uYMPszn4Z8 -- check the measurement at 6:30 of 60 fps with vsync on a 60 Hz display. The latency is not even close to 16 ms (1 frame, 1/60 s); it's ~118 ms (about 7.1 frames, 7.1/60 s).
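To put numbers on that, here's a back-of-the-envelope sketch in C. The 7.1-frame figure is the one quoted from the video above; treating end-to-end latency as frames-held-in-the-pipeline times frame time is my simplification, not something the video derives:

```c
#include <stdio.h>

/* Rough model: input-to-photon latency ~= number of frames the
 * pipeline holds (sim, render queue, swap chain, scanout) multiplied
 * by the frame time. At 60 Hz one frame is 1000/60 ~= 16.7 ms. */
int main(void) {
    const double refresh_hz = 60.0;
    const double frame_ms   = 1000.0 / refresh_hz;

    /* ~7.1 frames is roughly what the video measures at 6:30. */
    const double frames_in_pipeline = 7.1;

    printf("frame time: %.1f ms\n", frame_ms);                      /* 16.7 ms */
    printf("latency:    %.0f ms\n", frames_in_pipeline * frame_ms); /* ~118 ms */
    return 0;
}
```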

It's a far cry from the simplified pure math people have in mind when they think of fps in games or refresh rate for office work and typing. Software has gotten very lazy lately, and most of the time these issues get fixed by throwing more hardware at them, not by fixing the code.

> Software has gotten very lazy lately, and most of the time these issues get fixed by throwing more hardware at them, not by fixing the code.

Some things cannot be 'fixed'. It's always a trade-off: you can't expect to have all the fancy effects that rely on multiple frames and also get low latency.
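To make that concrete: temporal effects (TAA-style anti-aliasing, for instance -- my example, not one named in the thread) blend each new frame into an accumulated history, so the displayed image depends on many past frames by construction. A minimal sketch of that accumulation, with a blend factor I picked for illustration:

```c
#include <stdio.h>

/* Core of a TAA-style temporal effect: each displayed pixel value is
 * a blend of the current frame and the running history, so the output
 * depends on many past frames. alpha = 0.1 is illustrative only. */
int main(void) {
    const double alpha = 0.1;  /* weight given to the *current* frame */
    double history = 0.0;      /* accumulated pixel value */

    /* A pixel that turns fully bright (1.0) at frame 0 takes several
     * frames to converge on screen -- that lag is the trade-off. */
    for (int frame = 0; frame < 10; ++frame) {
        const double current = 1.0;
        history = alpha * current + (1.0 - alpha) * history;
        printf("frame %2d: displayed %.3f\n", frame, history);
    }
    return 0;
}
```

Drop the history buffer and the lag goes away, but so does the effect.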

If there were a simple software fix, GPU manufacturers would be all over it and pushing it into every engine. It's in their interest to have the lowest latency possible, to attract the more hard-core gamers (who then influence others).

Just look at all the industry cooperation that had to happen to implement adaptive sync. That chain runs all the way from game developers and engines to GPUs and monitors. Sure, that sells more hardware (which brings other benefits), but a software-only approach would also let companies sell hardware, by virtue of their "optimized" drivers.

> * deferred vs forward rendering (deferred adds latency)

Wah? Deferred just refers to a screen-space shading technique, but it still happens once every frame.
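For reference, a deferred renderer does a geometry pass into a G-buffer and then a lighting pass over it, both within the same frame. Here's a toy CPU-side sketch of that ordering (the struct layout and values are made up for illustration; a real renderer does this on the GPU):

```c
#include <stdio.h>

#define W 4
#define H 4

/* Toy G-buffer: per-pixel attributes written by the geometry pass. */
struct gbuffer_texel { float albedo, normal_z, depth; };
static struct gbuffer_texel gbuf[H][W];
static float framebuffer[H][W];

int main(void) {
    /* Pass 1: geometry pass -- rasterize scene attributes into the
     * G-buffer (dummy constants here). */
    for (int y = 0; y < H; ++y)
        for (int x = 0; x < W; ++x)
            gbuf[y][x] = (struct gbuffer_texel){ 0.8f, 1.0f, 0.5f };

    /* Pass 2: lighting pass -- shade every pixel from the G-buffer.
     * Both passes run back-to-back inside ONE frame; deferring the
     * shading to screen space adds no extra frame of latency. */
    const float light_intensity = 0.9f;
    for (int y = 0; y < H; ++y)
        for (int x = 0; x < W; ++x)
            framebuffer[y][x] =
                gbuf[y][x].albedo * gbuf[y][x].normal_z * light_intensity;

    printf("pixel (0,0) shaded: %.2f\n", framebuffer[0][0]);
    return 0;
}
```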

> * multithreaded vs singlethreaded

Not sure what you're saying here.

And then of course, yes, display buffering does have an impact.