Comment by colechristensen

1 year ago

Sure, but "4k" is still being used as a differentiator for streaming companies in how much they charge. Even then they serve up some pretty compressed streams where there's room to do less of that for a noticeable notch in quality.

There's of course a limit. The "native" bitrate equivalent of your retina isn't infinite.

The next step, though, is going to be light field displays (each "pixel" is actually a tiny display behind a lens, which together produce "real" 3D images). I assume that will become a thing; we'll see whether it does better than the last generation of 3D TVs/movies/etc. That's a big bump in bitrate.
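
Rough sketch of how big that bump is, with a completely hypothetical 8x8 grid of directional views per pixel:

    # Light field bandwidth multiplier: if each "pixel" emits a small grid
    # of directional views, raw bandwidth scales with the view count.
    # The 8x8 view grid is a made-up number for illustration.
    width, height = 3840, 2160
    fps = 60
    bits_per_pixel = 24
    views = 8 * 8                # hypothetical directional views per pixel
    raw_2d_bps = width * height * fps * bits_per_pixel
    lightfield_bps = raw_2d_bps * views
    print(f"2D raw:          {raw_2d_bps / 1e9:.1f} Gbps")
    print(f"light field raw: {lightfield_bps / 1e12:.2f} Tbps")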

There's also bitrate for things like game streaming and general remote-desktop use, where you need lots of headroom to make the latency work; you can't buffer several seconds of that.
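
In numbers, roughly: at 60 fps every frame has to make it across inside its own time slice, so the buffer is ~one frame rather than seconds (illustrative figures):

    # Why low-latency streaming can't buffer: at 60 fps each frame must be
    # encoded, sent, and decoded within a ~17 ms budget, so the receive
    # buffer holds roughly one frame instead of several seconds of video.
    fps = 60
    frame_budget_ms = 1000 / fps
    vod_buffer_s = 10            # a video-on-demand player might buffer this much
    game_buffer_frames = 1       # interactive streaming: ~one frame of slack
    print(f"per-frame budget: {frame_budget_ms:.1f} ms")
    print(f"VOD buffer:  {vod_buffer_s * fps:.0f} frames of slack")
    print(f"game stream: {game_buffer_frames} frame of slack")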

The next-gen sci-fi stuff, more integrated sensory experiences, is certainly going to be a thing eventually too. Who knows how much bandwidth that will need.

When more bandwidth becomes available, new things become possible, sometimes things that are hard to imagine before somebody gets bored and tries to build them.

When I'm futzing around with ML models, I'm loading tens of gigabytes from disk into memory. Eventually something like that, and things orders of magnitude larger, will probably be streamed over the network like nothing. PCIe 4.0 x16 is, what, ~32 GB/s? Why not that over a network link to every device in the house in 10 years?
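
The math checks out, roughly; the 50 GB model size is just an example:

    # PCIe 4.0: 16 GT/s per lane with 128b/130b encoding, x16 lanes,
    # per direction. The model size below is just an example figure.
    gts_per_lane = 16e9          # transfers/sec per lane
    payload_fraction = 128 / 130 # 128b/130b line encoding overhead
    lanes = 16
    bytes_per_sec = gts_per_lane * payload_fraction * lanes / 8
    model_bytes = 50e9           # hypothetical ~50 GB of model weights
    print(f"PCIe 4.0 x16: {bytes_per_sec / 1e9:.1f} GB/s")
    print(f"time to move 50 GB: {model_bytes / bytes_per_sec:.1f} s")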