
Comment by Kon5ole

7 hours ago

>> You can get 8K TVs for <$1000 now.

> 8K at jumbo TV size has relatively large pixels compared to an 8K desktop monitor. It’s easier to manufacture.

I don't think that's true.

I've been using an 8K 55" TV as my main monitor for years now. It was available for sub-800 USD before all such TVs vanished from the market. Smaller pixels were not more expensive even then; the 55" models were the cheapest.

4K monitors can be had for sub-200 USD, and a 55" 8K panel is essentially four such panels tiled at the same pixel density, so it should cost at most 4x that price. And it did, years ago.
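
A quick sanity check of the pixel-size claim (a minimal sketch; the 27.5" size is chosen because it makes the tiling exact):

    import math

    def ppi(width_px, height_px, diagonal_in):
        # Pixels per inch of a panel with the given resolution and diagonal.
        return math.hypot(width_px, height_px) / diagonal_in

    # A 55" 8K panel has the same pixel pitch as a 27.5" 4K monitor:
    # it is literally four such panels tiled 2x2 - same pixel size, 4x the area.
    print(ppi(7680, 4320, 55))    # ~160 PPI (55" 8K)
    print(ppi(3840, 2160, 27.5))  # ~160 PPI (27.5" 4K)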

So they were clearly not complicated or expensive to manufacture - but there was no compelling reason for 8K on a TV, so they didn't sell. However, there IS a compelling reason to have 8K on a desktop monitor!

That such monitors sell for 8000 USD+ is IMO a very unfortunate situation caused by weird incompetence in market segmentation by the monitor makers.

I firmly believe they could sell 100x as many if they cut the price to 1/10th, which they clearly could do. The market that never appeared for TVs is present among the world's knowledge workers, for sure.

I've been using an 8K 65" TV as a monitor for four years now. When I bought it, you could buy the Samsung QN700B 55" 8K, but at the time it cost 50% more than the 65" I bought (a TCL).

I wish the 55" 8K TVs still existed (or that the announced 55" 8K monitors had ever shipped). I make do with 65", but it's just a tad too large. I would never switch back to 4K, however.

  • What standard reliably works to drive 8K at 60 Hz, and how expensive are the cables?

    How far away do you sit from it? Does it sit on top of your desk? What do you put in all this space, and how do you manage it?

    I don’t think you’re maximizing one browser window over all 33 million pixels.

    • HDMI 2.1 is required, and the cables are not too expensive now.

      For newer GPUs (Nvidia 3000 series or equivalent) and high-end (or M4+) Macs, HDMI 2.1 works fine, but on Linux the open-source drivers have a licensing issue that makes HDMI 2.1 problematic.

      It works with certain Nvidia drivers, but I ended up getting a DP-to-HDMI 8K cable, which was more reliable. I think it could work with AMD and Intel as well, but I haven't tried.
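
      As a rough sanity check of why HDMI 2.1 is the floor here (illustrative numbers; real links also add blanking overhead, which is ignored below):

          # Raw, uncompressed video data rate in Gbps, ignoring blanking.
          def raw_gbps(w_px, h_px, fps, bits_per_pixel):
              return w_px * h_px * fps * bits_per_pixel / 1e9

          # 8K60 at 8-bit RGB (24 bpp) nearly saturates HDMI 2.1's 48 Gbps link
          # even before blanking, and 10-bit exceeds it outright - which is why
          # 8K60 over HDMI 2.1 relies on DSC compression.
          print(raw_gbps(7680, 4320, 60, 24))  # ~47.8 Gbps
          print(raw_gbps(7680, 4320, 60, 30))  # ~59.7 Gbps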

      In my case I have a 55" and sit a normal monitor distance away. I made a "double floor" on my desk with a cutout for the monitor, so the monitor's legs sit some 10 cm below the actual desk and the screen starts basically at the level of the desk surface. The gap between the desk panels is handy for keeping USB hubs, drives, headphone amps and such. And the Mac mini.

      I usually have reference material windows upper left and right, the coding project upper center, the code editor bottom center, and 2 or 4 terminals, Teams, Slack and mail on either side of the coding window. The center column is about twice as wide as the sides. I also have other layouts depending on the kind of work.

      I use layout arrangers like FancyZones (from PowerToys) on Windows and a similar mechanism in KDE, plus manual window management on the Mac.

      I run 2x scaling, so I get basically a 4K desktop area but at Retina-ish resolution. 55" is a bit too big, but since I run doubling I can read things even in the corners. A 50" 8K would be ideal.
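
      For concreteness, the arithmetic behind the scaled desktop and the column layout above (just a sketch of the numbers - not FancyZones' actual config format):

          # 2x scaling on an 8K panel yields a 4K-sized logical desktop.
          LOGICAL_W, LOGICAL_H = 7680 // 2, 4320 // 2   # 3840 x 2160 points

          # Three columns, center twice as wide as each side: w + 2w + w = 4w.
          side = LOGICAL_W // 4                  # 960 points
          center = 2 * side                      # 1920 points
          zones = {
              "left":   (0, side),               # reference material / chat
              "center": (side, center),          # project top, editor bottom
              "right":  (side + center, side),   # terminals / mail
          }
          for name, (x, width) in zones.items():
              print(f"{name}: x={x}, width={width}")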

      Basically the biggest problem with this setup is that it spoils you - and that it was only available several years ago. :(

  • What is the model number and how has the experience been?

    I've mostly read that TVs don't make great monitors. I have a TCL Mini LED TV which is great as a TV, though.

  • What do you watch on an 8K TV?

    There's no content.

    The average bitrate from anything that isn't a Blu-ray is not good even for HD, so you don't benefit from more pixels anyway. Sure, you are decompressing and displaying 8K worth of pixels, but the actual resolution of your content is more like 1080p, especially in the color channels, which are chroma-subsampled.
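
    To put a number on that (25 Mbps is just a ballpark figure for a high-end stream, used here for illustration):

        # Compressed bits available per displayed pixel per frame.
        def bits_per_pixel(bitrate_bps, w_px, h_px, fps):
            return bitrate_bps / (w_px * h_px * fps)

        # Raw 8-bit 4:2:0 video is 12 bits/pixel, so a ~25 Mbps "4K" stream
        # amounts to roughly 240:1 compression - and spreading the same stream
        # over 8K worth of pixels makes it 4x thinner still.
        print(bits_per_pixel(25e6, 3840, 2160, 60))  # ~0.050 bits/pixel
        print(bits_per_pixel(25e6, 7680, 4320, 60))  # ~0.013 bits/pixel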

    Normally, games are the place where arbitrarily high pixel counts could shine, because you could literally ensure that every pixel is calculated and make real use of it. But that's actually stupidly hard at 4K and above, so Nvidia just told people to eat smeary AI-upscaled garbage instead, throwing away the entire point of having a beefy GPU.
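
    The GPU-load side is easy to quantify - each resolution step multiplies the pixels that must be shaded every frame:

        # Pixels per frame, relative to 1080p.
        BASE = 1920 * 1080
        resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440),
                       "4K": (3840, 2160), "8K": (7680, 4320)}
        for name, (w, h) in resolutions.items():
            px = w * h
            print(f"{name}: {px / 1e6:5.2f} MP, {px / BASE:5.2f}x 1080p")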

    I was even skeptical of 1440p at higher refresh rates, but I bought a nice monitor with those specs anyway and was pleasantly surprised by the improvement - though the diminishing returns are obvious.