Comment by intrasight
1 day ago
Honest question: Will building a high-end PC still be a thing in 10 years? I've built all of mine in the last 20 years. Just finished my first AMD build. But I don't think it'll be possible or allowed after a few more CPU iterations. Sure, you'll be able to do builds with the CPU tech available up to when it stops, but I seriously doubt that the cutting-edge chip tech ten years hence will be available to hobbyists. Tell me why I'm wrong.
I think this hinges on what one considers "cutting edge CPU tech". Is it "newer and better CPU tech than before" or "the highest end CPU tech of the particular day".
If the latter ("the highest end CPU tech of the particular day"), I think it's going to keep getting harder and harder, with more top end options like the M4 Max being "prebuilt only", but I don't think it'll go to 0 options in as short as 10 years from now.
If the former ("newer and better CPU tech than before"), I think it'll last even longer than the above, if not indefinitely: technology will likely keep improving consistently enough that serving even a small niche better than before will remain a reasonable target market, regardless of what is considered mainstream.
You're going to have to unpack "allowed". Are you saying that the Apple model will win so heavily that separate parts will not be available? What change are you expecting?
NVIDIA not selling cutting-edge chips other than in bulk is a phenomenon of the AI bubble, which will eventually deflate. (I'm not saying it will go away, just that the massive training investments are unsustainable without revenue eventually catching up.)
No, you tell us why you think the next ten years are going to be different from the last thirty.
One possible reason: to achieve the performance improvements, we are seeing more integrated and soldered-together hardware, limiting later upgrades. The Framework Desktop, from the modular, user-upgradeable laptop company, has soldered-on memory because they had to "choose two" between memory bus performance, system stability, and user-replaceable memory modules.
If the product succeeds and the market starts saying that this is acceptable for desktops, I could see more and more systems going that way to get either maximum performance (in workstations) or space/power optimisation (e.g. N100-based systems). Then other manufacturers not optimising for either of these things might start shipping soldered-together systems just to get the BoM costs down.
> The Framework Desktop from the modular, user-upgradeable laptop company, has soldered-on memory because they had to "choose two" between memory bus performance, system stability, and user-replaceable memory modules.
No need to pick on Framework here; AMD could not make the chip work with replaceable memory. How many GPUs with user-replaceable (slotted) memory are there? Zero snark intended.
That’s a laptop. It’s soldered for space constraints.
There are high-speed memory module form factors. They just add thickness and cost, and they're not widely available yet.
Most use cases need the high speed RAM attached to the GPU, though. Desktop CPUs are still on 2-channel memory and it’s fine. Server configs go to 12-channel or more, but desktop hasn’t even begun to crack the higher bandwidth because it’s not all that useful compared to spending the money on a GPU that will blow the CPU away anyway.
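The gap is easy to estimate from peak transfer rates. A minimal sketch, using illustrative figures I'm assuming for the comparison (DDR5-5600 for a two-channel desktop, DDR5-4800 for a twelve-channel server part), not benchmarks:

```python
# Rough memory-bandwidth arithmetic (illustrative figures, not benchmarks).
# Each DDR channel is 64 bits (8 bytes) wide;
# peak bandwidth = channels * bytes_per_channel * transfer rate.

def ddr_bandwidth_gbs(channels: int, mt_per_s: int, bytes_per_channel: int = 8) -> float:
    """Peak theoretical bandwidth in GB/s for a DDR memory configuration."""
    return channels * bytes_per_channel * mt_per_s / 1000

desktop = ddr_bandwidth_gbs(channels=2, mt_per_s=5600)   # typical DDR5 desktop
server = ddr_bandwidth_gbs(channels=12, mt_per_s=4800)   # 12-channel server part

print(f"desktop, 2ch DDR5-5600: {desktop:.1f} GB/s")   # ~89.6 GB/s
print(f"server, 12ch DDR5-4800: {server:.1f} GB/s")    # ~460.8 GB/s
```

Even the server config is well short of the ~1 TB/s a current high-end GPU's GDDR subsystem delivers, which is the point: the bandwidth lives next to the GPU, not in the CPU's DIMM slots.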
The only market for desktops is gaming. Hence NVIDIA will just slap a CPU on their board and use the unified memory model to sell you an all-in-one solution. Essentially a desktop console.
Maybe some modularization will survive for slow storage. But other than that demand for modular desktops is dead.
Cases will probably survive since gamers love flashy rigs.
There are a handful of professional uses for a workstation that are hard to beat with a laptop.
If you're compiling code, you generally want as much concurrency as you can get, as well as great single core speed when the task parallelism runs out. There aren't really any laptops with high core counts, and even when you have something with horsepower, you run into thermal limits. You can try and make do with remoting into a machine with more cores, but then you're not really using your laptop, it might as well be a Chromebook.
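The "concurrency until the task parallelism runs out" point is just Amdahl's law in action. A quick sketch (the 10% serial fraction below is an assumed figure for the unparallelizable part of a build, e.g. linking, not a measurement):

```python
# Amdahl's law: speedup from n cores when a fraction of the work is serial.

def speedup(n_cores: int, serial_fraction: float) -> float:
    """Theoretical speedup of a workload with a fixed serial fraction."""
    return 1 / (serial_fraction + (1 - serial_fraction) / n_cores)

# Assume 10% of a build (e.g. the final link step) cannot be parallelized.
for n in (8, 32, 128):
    print(f"{n:3d} cores -> {speedup(n, 0.10):.2f}x speedup")
```

The speedup flattens out well below the core count, which is why single-core speed still matters once the parallel portion is exhausted.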
>The only market for desktops is gaming.
I disagree. My premise isn't that desktops are going away. It's that DIY custom-built desktops are destined for the trash heap of history, since you'll no longer be able to buy CPUs and memory. We will be buying desktops like the HP Z2 Mini Workstation - or its equivalent ten years from now.
>Cases will probably survive since gamers love flashy rigs
But only as a retro theme thing? Would enthusiasts just put a Z2 Mini, for example, inside the case, wire up the lights, and call it a day?
There is still a lot of productivity work that benefits from the power of a desktop: engineering (Ansys etc.), local AI development, 3D modeling, working with large C++/Rust codebases, scientific computing, and so on. And related to gaming, there is of course the huge game-developer market too. There is a reason NVIDIA and AMD still make workstation-class GPUs for big bucks.
If the processor comes with a built-in GPU, NPU, and RAM, will you really be building the system?
Sure. Building a PC is already barely building anything. You buy a handful of components and click them into each other.
RAM? Are we expecting on-chip RAM any time soon?
Yes, as that's already the case with phones. There is more to a phone than the SoC.