Comment by elabajaba
3 years ago
I trust their measurements; I just don't like how they score things, and people tend to use their scores instead of looking at the pros and cons and the measurements. (They weight all the different subscores and add them up, so e.g. if there were an excellent monitor except that it had a 100:1 contrast ratio, it'd still get great scores despite having a flaw so huge that most people would consider it essentially unusable.)
It's really bad for HDR monitors, where an edge-lit "fake HDR" monitor can get a 7 while failing the basics necessary for a proper HDR experience. Something like TFTCentral's or HardwareUnboxed's HDR checklists, which just straight up fail monitors that don't meet all the requirements, would be much better than their current (imo misleading) system, which can give good SDR monitors high HDR scores even when they're terrible at HDR.
Sounds like those basic components should be weighted more heavily, then?
Not really. As long as every single category is "good enough", the weights are reasonably correct. It's when a single category is a dealbreaker that the simple metric of adding them up doesn't work.
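
To make that concrete, here's a minimal sketch (the subscores, weights, and minimums are made up for illustration; rtings' actual weights aren't reproduced here) of how a weighted sum averages a dealbreaker away while a checklist-style gate catches it:

```python
# Hypothetical subscores on a 0-10 scale; weights are illustrative, not rtings' real ones.
subscores = {"brightness": 9.0, "color": 9.5, "response": 9.0, "contrast": 1.0}  # ~100:1 contrast
weights   = {"brightness": 0.25, "color": 0.25, "response": 0.25, "contrast": 0.25}

# Weighted sum: the dealbreaker category is averaged away by the strong ones.
weighted = sum(subscores[k] * weights[k] for k in subscores)
print(f"weighted sum: {weighted:.1f}")  # 7.1 -- reads as a decent monitor

# Checklist-style gating: fail outright if any hard requirement isn't met.
minimums = {"contrast": 5.0}  # e.g. a basic contrast requirement for usable HDR
passes = all(subscores[k] >= v for k, v in minimums.items())
print("passes checklist:", passes)  # False -- the 100:1 contrast fails it
```

Note that the weighted sum lands right around the "fake HDR monitor gets a 7" case from above: three excellent subscores pull the total up no matter how bad the fourth is, which is exactly the failure mode a pass/fail checklist avoids.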