Comment by bscphil

5 years ago

I think the more important fact is that people can be affected by small amounts of latency, even if they can't react that quickly or perhaps even discern that latency is occurring.

The obvious example here is a precision platformer like Celeste, but the same applies (with less and less force) to other genres, starting with FPS games.

In Celeste, there are a handful of frame-perfect inputs in the game. This means you have a window of under 20 ms (one frame at 60 fps) to get your input in, or you're dead (the game's only failure state). How is this possible, if human reaction time is only ~100 ms at best? It's because there's a difference between reaction time and timing. Reaction time measures your time-to-react to an unpredictable stimulus. Timing is your response to a predictable one. Most of the time in games you are reacting to a stimulus that is at least somewhat predictable.
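The size of that "frame-perfect" window follows directly from the frame rate. A quick sketch of the arithmetic (assuming a 60 fps update rate, which is my assumption, not something stated by the game):

```python
# A frame-perfect input must land within a single frame.
# At 60 fps, one frame lasts 1000 / 60 milliseconds.
frame_ms = 1000 / 60

print(f"one-frame window: {frame_ms:.1f} ms")  # ~16.7 ms, well under 20 ms
```

So the input window really is several times shorter than even an excellent ~100 ms reaction time, which is why timing (anticipation), not raw reaction, is what makes these jumps possible.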

So with a little training you can reliably make that frame-perfect jump. But if Stadia adds 60 ms of latency, your character's true position is over 3 frames ahead of where the screen shows her. You're going to miss that jump a lot until you can retrain your brain to account for the latency, as much as possible. And even then you'll probably find it harder. Throw in a little variability, so you compensate as if the display is 3 frames behind when it's actually 4, and you're doomed.
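Converting added latency into frames of lag is simple division. A minimal sketch, assuming 60 fps (the function name and numbers here are mine, for illustration):

```python
def frames_of_lag(latency_ms: float, frame_ms: float = 1000 / 60) -> float:
    """How many frames behind reality the displayed character is."""
    return latency_ms / frame_ms

# A steady 60 ms of added latency: over 3 frames of lag.
print(frames_of_lag(60))  # 3.6

# A modest jitter spike to 75 ms pushes the lag past 4 frames,
# invalidating whatever compensation you trained for.
print(frames_of_lag(75))  # 4.5
```

The jitter case is the killer: a fixed offset can be learned, but a compensation trained for 3-and-a-bit frames fails silently when the true lag drifts to 4.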

Granted, not every game is a precision platformer, so there are diminishing returns for low latency in other types of games. But if you, say, enable cross-play between Stadia and non-Stadia players in a shooter, the local players are probably going to have a huge advantage. Even making the game fair against an AI opponent would require significant work: the AI's reaction time would need to be keyed to Stadia's measured latency, not whatever value was originally hard-coded into the game.