Comment by mikepurvis
1 day ago
They're not targeting high-end PCs. They're targeting current-generation consoles, specifically the PS5 + 1080p. It just turns out that when you take those system requirements and put them on a PC, especially a PC with a 1440p or 2160p ultrawide, they translate to pretty top-of-the-line hardware. Particularly if, as a PC gamer, you expect to run it at 90fps and not the 30-40 that is typical for consoles.
Without disagreeing with the broad strokes of your comment, it feels like 4K should be considered standard for consoles nowadays - a very usable 4K HDR TV can be had for $150-500.
I think you're underestimating the computing power required to render (natively) at 4K. Some modern games can't even natively render at 1440p on high-end PCs.
That's a waste of image quality for most people. You have to sit very close to a 4K display to perceive the full resolution. On PC you could be 2 feet from a huge gaming monitor, but an extremely small percentage of console players have the screen size and viewing distance where they would get much out of full 4K. Much better to spend the compute on a higher framerate or higher detail settings.
> You have to sit very close to a 4k display to be able to perceive the full resolution.
Wait, are you sure you don't have that backward? IIUC, you don't[] notice the difference between a 2K display and a 4K display until you get up to larger screen sizes (say 60+ inches, give or take a dozen; I don't have exact numbers :) ), and with those the optimal viewing range is something like 4-8 feet, depending on the screen size.
Either that, or I'm missing something...
[]Generally, anyway. At 1-2 feet a 4K display should still read as crisper, but only slightly.
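For anyone who wants to sanity-check the distance argument, here's a rough back-of-the-envelope sketch in Python. It assumes the commonly cited ~1 arcminute of angular resolution for 20/20 vision; the function name and the 65-inch example are mine, purely for illustration:

```python
import math

def max_resolving_distance_ft(diagonal_in, horizontal_px, aspect=(16, 9),
                              acuity_arcmin=1.0):
    """Distance beyond which a viewer with ~20/20 vision (about 1 arcminute
    of angular resolution) can no longer pick out individual pixels."""
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)
    pixel_pitch_in = width_in / horizontal_px
    theta_rad = math.radians(acuity_arcmin / 60.0)  # 1 arcmin in radians
    return pixel_pitch_in / math.tan(theta_rad) / 12.0  # inches -> feet

# 65" 16:9 TV:
print(max_resolving_distance_ft(65, 3840))  # ~4.2 ft for 4K
print(max_resolving_distance_ft(65, 1920))  # ~8.5 ft for 1080p
```

By that estimate you stop resolving a 65" TV's 4K pixel grid beyond roughly 4 feet and its 1080p grid beyond roughly 8.5 feet, so whether 4K is "worth it" really does come down to couch distance.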
I think higher detail is where most of it goes. A lower resolution, upscaled image of a detailed scene, at medium framerate reads to most normal people as "better" than a less-detailed scene rendered at native 4k, especially when it's in motion.
Assuming you can render natively at high FPS, 4K makes a bigger difference on rendered images than live action because it essentially brute-forces antialiasing.
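To make the antialiasing point concrete, here's a toy sketch of what supersampling does; the function and the 8x8 test image are made up for illustration, and a real renderer uses much better reconstruction filters:

```python
def box_downsample(img, factor=2):
    """Average factor x factor blocks of a higher-res image: supersampling
    in miniature, roughly what happens to a 4K render's jagged edges once
    you're far enough away that adjacent pixels blend."""
    return [[sum(img[y * factor + dy][x * factor + dx]
                 for dy in range(factor) for dx in range(factor)) / factor**2
             for x in range(len(img[0]) // factor)]
            for y in range(len(img) // factor)]

# A hard diagonal edge rendered at "high res" (1.0 = white, 0.0 = black).
hi_res = [[1.0 if x > y else 0.0 for x in range(8)] for y in range(8)]
for row in box_downsample(hi_res):
    print(row)  # edge pixels come out fractional (0.25) instead of jagged 0/1
```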
1440p and 2160p are a total waste of pixels when 1080p is already at the level of human visual acuity. You can argue that 1440p is a genuine (slight) improvement for super-crisp text, but not for a game. HDR, more ray tracing/path tracing, etc. are more sensible ways of pushing quality higher.
You sound like someone who doesn't have a 1440p or 2160p display.
I have a 77" S95D and my 1080p Switch looked horrible on it. Try it with a 1080p screen bigger than 27 inches.
I also have a 77” OLED and there's no question that 4K content is noticeably better looking on it than 1080p content.
> 1440p and 2160p is a total waste of pixels, when 1080p is already at the level of human visual acuity.
Wow, what a load of bullshit. I bet you also think the human eye can't see more than 30 fps?
If you're sitting 15+ feet away from your screen, yeah, you can't tell the difference. But for most people, with their eyes only being 2-3 feet away from their monitor, the difference is absolutely noticeable.
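Quick sanity check of that claim, assuming the usual ~60 pixels-per-degree rule of thumb for 20/20 acuity (the function and the 27"/30" figures are mine, for illustration):

```python
import math

def pixels_per_degree(diagonal_in, horizontal_px, distance_in, aspect=(16, 9)):
    """Angular pixel density as seen by the viewer. ~60 px/deg is the usual
    rule-of-thumb limit for 20/20 vision."""
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)
    pixel_pitch_in = width_in / horizontal_px
    return 1.0 / math.degrees(math.atan(pixel_pitch_in / distance_in))

# 27" monitor viewed from 30" (2.5 ft):
print(pixels_per_degree(27, 1920, 30))  # ~43 px/deg: 1080p, pixels resolvable
print(pixels_per_degree(27, 2560, 30))  # ~57 px/deg: 1440p, near the limit
print(pixels_per_degree(27, 3840, 30))  # ~85 px/deg: 4K, past the limit
```

At typical desk distances, 1080p on a 27" panel falls well short of the acuity limit, which matches the experience that 1440p and 4K look visibly sharper there.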
> HDR and more ray tracing/path tracing, etc. are more sensible ways of pushing quality higher.
HDR is an absolute game-changer, for sure. Ray-tracing is as well, especially once you learn to notice the artifacts created by shortcuts required to get reflections in raster-based rendering. It's like bad kerning. Something you never noticed before will suddenly stick out like a sore thumb and will bother the hell out of you.
Text rendering alone makes it worthwhile. 1080p densities are not high enough to render text accurately without artefacts. If you double the pixel density, it becomes (mostly) possible to render text weight accurately, and things like "rhythm" and "density", which real typographers concerned themselves with, start to become apparent.
I'm sorry, you need to go to an optician. I can see the pixels at a comfortable distance at 1440p.
Alternatively, you play modern games with incredibly blurry AA solutions. Try looking at something older from when AA actually worked.
You're probably looking up close at a small portion of the screen - you'll always be able to "see the pixels" in that situation. If you sit far back enough to keep the whole of the screen comfortably in your visual field, the argument applies.
You are absolutely wrong on this subject. Importantly, what matters is PPI, not resolution. 1080p would look like crap in a movie theater or on a 55" TV, for example, while it'll look amazing on a 7" monitor.
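PPI falls straight out of the diagonal and the resolution, for anyone who wants to plug in their own screens (a quick sketch; the 55" and 7" figures just mirror the examples above):

```python
import math

def ppi(diagonal_in, horizontal_px, vertical_px):
    """Pixel density (pixels per inch) from diagonal size and resolution."""
    return math.hypot(horizontal_px, vertical_px) / diagonal_in

# The same 1920x1080 image at wildly different densities:
print(ppi(55, 1920, 1080))  # ~40 PPI on a 55" TV
print(ppi(7, 1920, 1080))   # ~315 PPI on a 7" screen
```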