
Comment by Jonnax

5 years ago

Anyone that has used a 120Hz+ display will immediately notice that everything feels more responsive even with simple things like moving your mouse and scrolling.

I also find the responsiveness noticeable when typing. You'll be able to tell that the text appears faster on the screen.

I wouldn't say it's a requirement. But it does make the user experience nicer.

Now if only someone made a 34” ultra wide with 120Hz+. In particular one with 5120 x 2160 resolution. Maybe in 2022 at the rate high hertz monitors are coming for non-gaming uses...

  • I agree. I'd rather have hi-dpi 32"+ monitors first, though. Hidpi displays are becoming standard on top-end laptops, but on desktop you're almost out of luck.

    It's wild that the only hidpi 32" monitor is ~$5000, and it was released a few years ago now.

    My eyeballs need that sweet crisp text.

    • What are you calling "hi-dpi"? There's a ton of 4k monitors in the 27" to 32" range, which are definitely higher DPI than typical. And for a lot less than $5000.


    • There were some really dark times from around 2002 to 2012 when good CRTs became unavailable and almost every LCD screen was "HD", meaning 1080p. We're still at the tail end of those times, I guess.


  • Monitors like the LG 27GL850 are 1440p 144hz with great color repro. Not ultra-wide, but ultra-wide is not something I personally value.

    • I currently use multiple monitors, and apparently nobody at Apple HQ uses a monitor in portrait mode. Catalina routinely forgets the monitor is portrait and requires manually changing it back _every time the machine is locked_.

      Why is this relevant? I'm seriously considering a 1440p 32:9 240hz Samsung G9 so that I can ditch multiple monitors and move to a single display with similar overall screen space just to dodge pesky multi-screen bugs like this. Just docking my laptop and using a massive screen would be SO NICE!

      Only thing that's holding me back is I really want 2160p tall in that form factor. Will probably need to wait for DisplayPort 2.0.

    • I personally can't stand the ultrawides. OS support for a huge monitor just isn't where I'd want it to be. Sure I can install applications to handle window management better and whatnot, but even trivial things like fullscreen video don't work well at all.

      I'm a huge fan of the LG27GN950. 4k 144hz glory.

  • There are plenty of ultra-wide monitors with greater than 120Hz refresh. Just not with the resolution you are asking for.

    Or at the size you are looking for.

    Samsung 49-Inch CHG90 144Hz

    It's a 49 inch, so not the same PPI. Different aspect ratio too (32:9, so 5120x1440).

  • I'm looking for something similar, but not ultrawide. There's plenty of 27" 4k 100Hz+ monitors, but not the same in 30-36" sizes. I have 3x 24" monitors, and would like to replace one or two of them with one bigger screen that's higher resolution and faster.

    • Yeah, I'm on an iMac Pro (which has the same display as the 27" iMacs) as my main display, and it's pretty glorious with 218 PPI. My older 34" ultra wide next to it is really showing its age, both in terms of overall quality and 110 PPI. The newer LG 34" that is 5K2K is not horrible at 164 PPI.

      If Apple just sold their 27" display for 1500 bucks or something, they would make a killing with folks who want a really nice pro display but don't need the overkill of the Pro Display XDR.

  • My current dream monitor is a 42inch 3840x2400 or 7680x4800 - 120hz screen. I love the 16:10 aspect ratio and real estate of a monitor like that. I’m currently running 3 24inch monitors in portrait mode, so 3600x1920

    • I am still mad that all display vendors went 16:9. The black bars are fine when watching movies if you mostly use the display for productive work, whereas 4:3 -> 16:9 was a shitty transition for someone like me working mostly with text and written music. It was as if the world had decided that the use case for computer screens was something other than what I was using them for. Although I do consume some video content on my computer today, I would switch to a high-res 4:3 display instantly.

      A 27" 4000x3000 display would be a dream come true.

  • I'm waiting for the same thing. Hoping LG actually has this in the works, as their current 34" 5120 x 2160 has been out of stock for a while now, and they make a bunch of other high refresh rate displays.

  • I doubt you'll find any gaming monitors at that resolution for a while. The newest Nvidia GPUs can't hit high enough frame rates at 4k to utilize 120Hz.

    Why do you need 120Hz+ for productivity?

    • 1. With DLSS 2.0 you can get 120+ fps in a lot of things on the 3080 @ 4k

      2. No one need do anything except die, but shaving a few ms of response time is nice for productivity. No one thing is critical, but making sure you have a keyboard, mouse, monitor, refresh rate, and programs that aren't throwing latency out the window makes for an overall nice-feeling system.

    • My understanding is that if you aren't gaming (i.e. most of the content on your screen is static) your GPU is smart enough not to redraw the entire frame so you can probably get very low latency refreshes.

    • Even without DLSS, this isn’t necessarily true. My 3080 has no problems pegging some recent games at 144-160 fps at the corresponding refresh rate. In fact, I’m generally CPU bottlenecked at 1440p or below.

I bought a 165Hz gsync screen years ago for gaming.

I bought all of my other high-refresh screens because, wow, day-to-day desktop computing feels so much better with it! Not joking in the least.

  • My brother swears by his 140hz display, I have used it for a bit and came away feeling underwhelmed - I prefer my 4K 10bit 60hz display

    • I can understand if you need the pixels and bit depth for color work. I just work with sRGB, and the panel I have is fine for that. I really do like the speed, though.

But the text isn't making it from your keyboard through the machine and to the monitor any faster. Seems like at 120hz you're just maybe going to catch it a frame earlier when it shows up.

  • It is, because there may be multiple things synchronizing their inputs and outputs to the refresh, which makes the refresh-related latency a number of frames. E.g., in some Linux compositors inputs are latched with the refresh, apps themselves may render in sync, and the compositor also draws in sync; the GPU driver also typically introduces at least a frame of latency.

    Another factor is that some (many?) 60 Hz displays buffer a whole frame themselves, and often don't have quick response times. If you go from a 10 ms response time IPS screen with a frame buffer to a 120 Hz gaming screen with a 2-3 ms response time, you've already got a difference of about 25 ms just in the screen itself.

    8 ms is hard to notice. 50 ms less so.
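    A back-of-envelope version of the screen-only numbers above (the buffer counts and response times are illustrative figures, not measurements):

```python
# Screen-side latency only: full frames buffered inside the monitor,
# plus the panel's pixel response time. Figures below are illustrative.

def screen_latency_ms(refresh_hz, buffered_frames, response_ms):
    return buffered_frames * 1000.0 / refresh_hz + response_ms

slow = screen_latency_ms(60, buffered_frames=1, response_ms=10.0)   # ~26.7 ms
fast = screen_latency_ms(120, buffered_frames=0, response_ms=2.5)   # 2.5 ms
print(round(slow - fast, 1))  # ~24 ms, roughly the "about 25 ms" above
```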

    The difference is pretty huge, even on systems that are much better tuned than Linux desktops (e.g. Windows 10).

    That being said, while it is very nice and feels nice, it's not necessary for development work; I spend most of my days developing on a system over a VNC connection through a VPN, so the basic input lag of that setup is around 200-300 ms. Gnarly, yes, but not particularly bad for text input. You get used to just doing everything very slowly with the mouse.

    • > The difference is pretty huge, even on systems that are much better tuned than Linux desktops (e.g. Windows 10).

      What would you suggest, on either Windows 10 or Linux, to get the lowest latency in a terminal?


  • In the article: > We get a 90 ms improvement from going from 24 Hz to 165 Hz.

    As per the linked article, the observed delay improvement isn't one frame; it's more like ~3 frames. It doesn't go from 1/24 to 1/165, it goes from 2.5/24 to 2.5/165.

    Computer software waits a lot more than it did in 1977. That's why 240hz displays feel much snappier even if it's supposed to be less noticeable -- you're waiting for the same ~3 frames, but they pass by much faster.
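    The arithmetic is quick to check:

```python
frames_of_delay = 2.5  # pipeline depth observed in the article, not 1 frame

for hz in (24, 165):
    print(f"{hz} Hz: {frames_of_delay / hz * 1000:.1f} ms")

improvement = (frames_of_delay / 24 - frames_of_delay / 165) * 1000
print(round(improvement), "ms")  # ~89 ms, matching the article's ~90 ms figure
```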

  • I guess so; the difference between 8ms and 16ms is not much for 1 frame.

    Though in my totally subjective experience it feels better.

    Interestingly the person who did this latency test also did a keyboard latency test:

    https://danluu.com/keyboard-latency/

    Compared to the slowest keyboard measured, it's possible to shave 45ms, which, if you were latency sensitive, would be the biggest reduction.

    • > A major source of latency is key travel time. It’s not a coincidence that the quickest keyboard measured also has the shortest key travel distance by a large margin. ... Note that, unlike the other measurement I was able to find online, this measurement was from the start of the keypress instead of the switch activation.

      That's disappointing. He isn't measuring latency nearly as much as he's measuring point of actuation.


And on the other side: I very much notice the difference between a corded mouse and a Bluetooth mouse (not a cheap one). It's not unusable, but frustrating enough that I prefer the corded ones. And I'm not a gamer, just doing office stuff, browsing and programming.

  • I have tried like 5 Bluetooth mice; since BT 4.0 the latency has improved from 'terrible' to 'tolerable'. It might be something to do with the Windows Bluetooth stack, as I noticed the response was more sluggish during high CPU use, while USB mice seem unaffected.

    The same Logitech mouse performs much faster through their universal wireless adapter than through Bluetooth (it has 2 modes). They also have a Lightspeed adapter, but I haven't noticed much difference.

  • If you haven't already, get a high refresh rate monitor and a 1000 reports/sec "gaming" mouse. The Razer Viper Mini is light, which adds to the feeling of responsiveness. I've bought a couple for other people and they love them.

    I recently got a 165 Hz monitor for my decade-old PC (Sandy Bridge era) and with my (now old) G302 mouse, it's like having a new, much faster PC.

Maybe there is also something there about IDE hints and whatnot rendering faster, but those are usually bottlenecked by some async background work (language servers, linters, etc.) anyway, which would negate the benefit of the faster screen refresh rate.

This can also be enhanced by tweaking up your mouse sample rates and whatnot. I used to do a lot of setting PS/2 mouse sample rates to 80Hz or more in the late 90s, which made Windows machines feel almost as nice to use as a Mac.

  • Isn't PS/2 interrupt based, therefore instantaneous? I think USB has 125 Hz polling rate by default which can be increased to 1000 Hz.

    • From Wikipedia:

      > The interface has two main signal lines, Data and Clock. ... To transmit a byte, the device simply outputs a serial frame of data (including 8 bits of data and a parity bit) on the Data line serially as it toggles the Clock line once for each bit.

      Increasing the clock rate absolutely does reduce latency on a PS/2 port.

      1. https://en.wikipedia.org/wiki/PS/2_port
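      So PS/2 delivery is event-driven, but the sample rate and the serial clock still bound the latency. A rough sketch (the clock rate and packet size below are nominal PS/2 figures I'm assuming, not measurements):

```python
# Nominal PS/2 timing: clock is typically ~10-16.7 kHz, frames are 11 bits
# (start + 8 data + parity + stop), and a standard mouse packet is 3 bytes.

def ps2_packet_ms(clock_hz, bytes_per_packet=3, bits_per_byte=11):
    """Wire time for one mouse packet on the serial clock."""
    return bytes_per_packet * bits_per_byte / clock_hz * 1000

def avg_sampling_ms(sample_rate_hz):
    """On average a motion event waits half a sample interval to be reported."""
    return 1000 / sample_rate_hz / 2

print(round(ps2_packet_ms(12_500), 2))  # wire time: ~2.64 ms per packet
print(avg_sampling_ms(80))              # 80 Hz sampling: 6.25 ms average wait
print(avg_sampling_ms(200))             # 200 Hz (PS/2 max): 2.5 ms average wait
```

      So the sample rate tends to dominate; bumping it from the default matters more than the clock for typical mice.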

Yeah, this is my favorite gimmick of my Samsung Galaxy S20. The 120Hz display + 240Hz touch sensor makes everything feel so fast and responsive; this is the first Android device that feels faster than my iPad.

  • Does a 240hz sensor help when the display responds at half that rate?

    I guess it's because the latencies stack on top of each other?

    • It's quite common for the touch panel to run at 2x the display refresh rate. E.g., 120hz touch panels have been quite common for years on phones despite the 60hz display.

      It helps slightly with latency directly, but it also gives you more points to sample from for input prediction, which then helps with latency even more. And for things like drawing applications it'll give you smoother curves, assuming the drawing app looks at all intermediate touch samples.
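      A toy sketch of the prediction idea, assuming simple linear extrapolation from the last two samples (real input-prediction pipelines are fancier):

```python
# With a 240 Hz panel you get fresh samples every ~4.17 ms, so you can
# extrapolate where the finger will be at the next display refresh.

def predict_next(samples, dt_ms):
    """Linearly extrapolate the next (x, y) from the last two (t, x, y) samples."""
    (t0, x0, y0), (t1, x1, y1) = samples[-2:]
    vx = (x1 - x0) / (t1 - t0)
    vy = (y1 - y0) / (t1 - t0)
    return x1 + vx * dt_ms, y1 + vy * dt_ms

# Samples ~4.17 ms apart; predict 8.33 ms ahead (one 120 Hz display frame).
samples = [(0.0, 100.0, 200.0), (4.17, 102.0, 203.0)]
print(predict_next(samples, 8.33))  # roughly (106.0, 209.0)
```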

Some recent PC/laptop displays can do 100hz and I even feel a difference between that and the long-standard 60hz. It's a small effect but it's noticeable and it looks a bit smoother and better.

Scrolling, yes. Mouse, absolutely. Typing? Not really. I have tested this side-by-side.

  • You may not notice it, but that doesn't mean nobody does. Also, it can depend on your operating system. Windows 10 (maybe everything now?) forces you to be in vsync, so increasing the refresh rate has a non-trivial improvement on the number of ms for a keypress response.

    Now whether you will notice that can depend on what environment you are in. If your editor already has an input latency of 100ms, then shaving 8 off probably is not noticeable. But going from 20 to 12 might be.

    • Definitely not everything. Non-compositing window managers on Linux (non-wayland of course) work in the old way.

    • I'm on Windows 10, and I have tested various text input boxes. There's no way latency is anywhere near 100ms; I would notice that.

      Generally, from the tests I've seen, input latency is 30-40ms on a wired keyboard on Windows.


The rate of persistence of the human eye is around 25Hz, hence PAL and NTSC framerates. The upper bound for those hypersensitive is around 50Hz, hence the later generation monitors. Anything above this is nothing more than marketing hype. It seeming smoother is merely a placebo to justify the extra cost.

  • Countless blind tests have shown a noticeable difference up to refresh rates over 100hz (and potentially greater). This is the first of many examples that I found: https://techreport.com/news/25051/blind-test-suggests-gamers...

    "The human eye can't tell the difference past 30 FPS" was literally just a thought-killing cliche repeated by console gamers getting into internet slapfights with PC gamers.

    You can see the difference for yourself here on any 60hz monitor: http://www.30vs60fps.com/

    • While the difference is noticeable in that 30 vs 60 Hz example, high-contrast foreground/background scrolling is an even starker example: https://www.testufo.com/framerates-text. Just be warned high contrast examples like that can have confounding factors on some displays due to ghosting.

    • Corridor Crew did a fun and interesting video[1] on how good the eye is.

      In it, they have a segment on frame rate[2], where he mentions that the neurons can only fire about every 13ms, or about 75 FPS. But, the important point, they're not in sync like a computer screen is. This means the effective update rate for a group of neurons can be much less.

      [1]: https://youtu.be/sPpAXMH5Upo

      [2]: https://youtu.be/sPpAXMH5Upo?t=252

  • This is not correct, but let's not dispute that and just go with your claim of 25Hz.

    Even then, the problem is that eyes don't work like cameras or monitors. Our eyes don't work with "frames". Each receptor updates on its own time. It's easy to see that, if the updates are staggered, multiple sensors could perceive higher frame-rates, even if they can't individually.

    However, there's another angle to this. Disregarding input lag, which is a very real phenomenon and is greatly shortened by higher refresh, higher refresh rate monitors are able to show more 'discrete' steps for anything that's in motion. Our eyes (and brains) perceive this as movement 'smoothness', even if they can't quite make out every single frame that's displayed.

    You should try that yourself. Do a blind test.

  • This is very incorrect. With video, every person can tell the difference between 30hz and 60hz, if they at least know what to look for.

    The easiest way to show this is by wiggling your mouse around quickly on a computer screen.

    At 30hz - you'll see the mouse teleporting around - hopping from spot to spot rather than moving smoothly. For example, if you stare in the middle, and jerk the mouse quickly to the right, you'll see the 4 or 5 spots where it rendered.

    With 60hz, you'll see the 9 or 10 spots - and have a stronger illusion of movement.

    With 120hz, it might even look as smooth as a real object flying across your screen.
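    A quick way to sanity-check those spot counts (the ~150 ms flick duration is my own assumption):

```python
# Distinct cursor positions rendered during a quick flick: one per frame
# that falls inside the flick. The 150 ms duration is an assumed figure.

flick_ms = 150
for hz in (30, 60, 120):
    print(hz, "Hz:", round(flick_ms / 1000 * hz), "positions")
# prints roughly 4, 9, and 18 positions for the three rates
```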

    • I'm wondering if this is why I find 120hz so jarring. It's almost as if I can't perceive the 'hopping' consciously, but I can subconsciously. It feels like my brain is asking "Wait, how did you get over there?" where "over there" is a fraction of a millimeter. I think maybe there is an uncanny valley for motion.

  • Sorry you're wrong.

    It's as absurd as saying it's impossible to tell the difference between a 55" 720p display compared to a 4k one at a distance of 1 foot away.

  • Are you claiming humans would fail an ABX test between a 60Hz and a 120Hz monitor? That sounds like a pretty extraordinary claim.

  • So does it seem smoother or does it not? I definitely notice the difference with 60fps video and 120Hz+ monitors.

    Additionally, a thing I've noticed in the last decade is the very badly tuned PWM frequencies of LED headlights on some cars. Those engineers dimmed the LEDs with terrible PWM frequencies, which leave flicker-trails when you look at or away from them.

    Get those PWM frequencies above 500Hz please! Or at the very least above 200Hz.

  • ~25Hz is the _lower bound_ for video* to appear smooth, _as long as there is motion blur_. But user interfaces do not add motion blur to moving objects. A computer monitor at 25Hz would be horrendous to use.

    Lower framerates are less noticeable in low light, which is another reason why films look acceptable.

    *Talented animators/cartoonists can get away with lower framerates