
Comment by KapKap66

3 days ago

There's a problem when people who aren't very sensitive to latency try to track it: their perception of what "instant" actually means is wrong. For them, instant is something like one second. For someone who cares about latency, instant is less than 10 milliseconds, or whatever threshold makes the difference between input and result imperceptible. People have the same problem judging video game framerates because they rarely compare them back to back (there are perceptual differences between framerates of 30, 60, 120, 300, and 500, at minimum, even on displays incapable of refreshing at those higher speeds), but you'll often hear people say that 60 fps is "silky smooth," which is not true whatsoever lol.

If you haven't compared high and low latency directly, side by side, then there are good odds you don't know what it looks like. There was a Twitter video from a while ago that did a good job of showing it off; it's one of the replies to the OP: https://x.com/jmmv/status/1671670996921896960

Sorry if I'm being too presumptuous; you might be completely correct, and instant really is instant in your case.

Sure, but there's no limit to what people can decide to care about. There will always be people who want more speed and less latency, but the question is: are they right to do so?

I'm with the person you're responding to. I use the regular suite of applications and websites on my 2021 M1 MacBook. Things seem to load just fine.

> For someone who cares about latency, instant is less than 10 milliseconds

Click latency on the fastest input devices is about 1 ms, and with a 120 Hz screen you're waiting 8.3 ms between frames. If someone is annoyed by 10 ms of latency, they're going to have a hard time in the real world, where everything takes longer than that.
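That 8.3 ms is just 1000 ms divided by the refresh rate; a throwaway Python snippet (refresh rates picked arbitrarily) makes the scale concrete:

    # Frame interval in milliseconds for a given refresh rate: 1000 / hz.
    for hz in (60, 120, 144, 240, 500):
        print(f"{hz:>3} Hz -> {1000 / hz:.2f} ms per frame")
    # 120 Hz -> 8.33 ms, the figure above; even 500 Hz still takes 2 ms.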

I think the real difference is that 1-3 seconds is a completely negligible launch time for an app you're going to be using all day or all week, so most people do not care. That's effectively instant.

The people who get irrationally angry that their app launch took 3 seconds out of their day instead of being ready to go on the very next frame are just never going to be happy.

  • I think you're right; maybe the disconnect is UI slowness?

    I am annoyed at the startup time of programs that I keep closed and only open infrequently (Discord is one of those; its update loop takes a buttload of time because I don't use it daily), but I'm not annoyed when something I keep open takes 1-10 s to open.

    But when I think of getting annoyed, it's almost always because an action I'm doing takes too long. I grew up in an era with worse computers than we have today, but clicking a new list was perceptibly instant; it was as if the computer was waiting for the screen to catch up.

    Today, it feels like the computer chugs to show you what you've clicked on. This is especially true with universal software, like chat programs, that everyone in an org is using.

    I think Casey Muratori's point about the watch window in Visual Studio is the right one. The watch window used to be instant, but someone added an artificial delay before processing so that the CPU wouldn't churn when stepping quickly through the code. The result is that, well, you've got to wait for the watch window to update... which "feels bad". (A toy sketch of that tradeoff follows the link below.)

    https://www.youtube.com/watch?v=GC-0tCy4P1U
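    To make the tradeoff concrete, here's a minimal Python sketch; the 250 ms delay and the function names are made up for illustration and are not Visual Studio's actual implementation:

        import time

        ARTIFICIAL_DELAY_MS = 250  # made-up value, not VS's real delay

        def evaluate_watches():
            # Stand-in for re-evaluating the watched expressions.
            return sum(range(10_000))

        def update_immediate():
            start = time.perf_counter()
            evaluate_watches()
            return (time.perf_counter() - start) * 1000

        def update_debounced():
            # Wait first, in case the user steps again before the delay
            # expires; this saves CPU during rapid stepping but adds
            # latency to every single update.
            start = time.perf_counter()
            time.sleep(ARTIFICIAL_DELAY_MS / 1000)
            evaluate_watches()
            return (time.perf_counter() - start) * 1000

        print(f"immediate: {update_immediate():6.1f} ms")
        print(f"debounced: {update_debounced():6.1f} ms")

    The debounced version can never beat its own delay, so every step feels slow even when evaluation itself is nearly free.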

I fear that such comments are similar to the old 'a Monster cable makes my digital audio sound more mellow!'

The eye perceives at about 10 Hz. That's 100 ms per capture. Beyond that, I'd have to see a study that shows how any higher framerate can possibly be perceived or useful.

  • Well if you believe that, start up a video game with a framerate limiter and set your game's framerate limit to 10 fps (conceptually, a limiter just pads each frame out to a fixed time budget; see the sketch below), then tell me how much you enjoy the experience. By default your game will likely be running at either 60 or 120 fps if you're vertically synced (it depends on your monitor's refresh rate). Make sure to switch back and forth between 10 and 60/120 to compare.

    Even your average movie is captured at 24 Hz. Again, it's very likely you've never actually compared these things for yourself back to back, as I mentioned originally.
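    Here is that frame-limiter idea as a rough Python sketch (the render function is a do-nothing placeholder):

        import time

        def run_capped(fps_cap, render_frame, frames=100):
            budget = 1.0 / fps_cap  # seconds each frame is allowed to take
            for _ in range(frames):
                start = time.perf_counter()
                render_frame()
                # Sleep off whatever is left of this frame's time budget.
                leftover = budget - (time.perf_counter() - start)
                if leftover > 0:
                    time.sleep(leftover)

        # At 10 fps each frame is padded to at least 100 ms; at 60 fps, ~16.7 ms.
        run_capped(10, lambda: None, frames=10)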

  • >The eye perceives at about 10 Hz. That's 100 ms per capture. Beyond that, I'd have to see a study that shows how any higher framerate can possibly be perceived or useful.

    It takes effectively no effort to conduct such a study yourself. Just try re-encoding a video at several frame rates up to your monitor's refresh rate (a sketch of that is below), or look at a monitor with a higher refresh rate than the one you normally use.
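    A minimal sketch of that experiment, assuming ffmpeg is installed; input.mp4 and the output names are placeholders:

        import subprocess

        # Re-encode one clip at several frame rates, then play the results
        # back to back and see where you stop noticing a difference.
        for rate in (10, 24, 30, 60):
            subprocess.run(
                ["ffmpeg", "-y", "-i", "input.mp4",
                 "-vf", f"fps={rate}",  # drop/duplicate frames to hit the target
                 f"out_{rate}fps.mp4"],
                check=True,
            )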

  • > The eye perceives at about 10 Hz.

    Not sure what this means; the eye doesn’t perceive anything. Maybe you’re thinking of saccades or round-trip response times or something else? Those are in the ~100ms range, but that’s different from whether the eye can see something.

    This paper shows pictures can be recognized from a 13 ms exposure, shorter than a single frame at 60 Hz (16.7 ms), and that's for full scenes, not even motion tracking or small localized changes. https://link.springer.com/article/10.3758/s13414-013-0605-z

  • Modern operating systems run at 120 or 144 Hz screen refresh rates nowadays. I don't know if you're used to that yet, but try going back to 60; it should be pretty obvious when you move your mouse.