One possible reason: to achieve the performance improvements, we are seeing more integrated and soldered-together stuff, limiting later upgrades. The Framework Desktop, from the modular, user-upgradeable laptop company, has soldered-on memory because they had to "choose two" among memory bus performance, system stability, and user-replaceable memory modules.
If the product succeeds and the market starts saying that this is acceptable for desktops, I could see more and more systems going that way to get either maximum performance (in workstations) or space/power optimisation (e.g. N100-based systems). Then other manufacturers not optimising for either of these things might start shipping soldered-together systems just to get the BoM costs down.
> The Framework Desktop from the modular, user-upgradeable laptop company, has soldered-on memory because they had to "choose two" between memory bus performance, system stability, and user-replaceable memory modules.
No need to pick on Framework here; AMD could not make the chip work with replaceable memory. How many GPUs with user-replaceable (slotted) memory are there? Zero. No snark intended.
That’s a laptop. It’s soldered for space constraints.
There are high-speed memory module form factors (e.g. CAMM2). They just add thickness and cost, and they're not widely available yet.
Most use cases need the high speed RAM attached to the GPU, though. Desktop CPUs are still on 2-channel memory and it’s fine. Server configs go to 12-channel or more, but desktop hasn’t even begun to crack the higher bandwidth because it’s not all that useful compared to spending the money on a GPU that will blow the CPU away anyway.
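The channel-count point above is easy to put numbers on. A rough back-of-envelope sketch (the specific DDR5 speeds and channel counts below are illustrative assumptions, not figures from the thread):

```python
# Peak memory bandwidth ~= transfer rate (MT/s) x bus width per channel.
# DDR5-5600 and DDR5-4800 below are assumed example parts.

def channel_bw_gb_s(mt_per_s: int, bus_bits: int = 64) -> float:
    """Peak bandwidth of one memory channel in GB/s."""
    return mt_per_s * (bus_bits // 8) / 1000  # MB/s -> GB/s

desktop = 2 * channel_bw_gb_s(5600)   # dual-channel desktop DDR5-5600
server = 12 * channel_bw_gb_s(4800)   # 12-channel server DDR5-4800

print(f"desktop 2ch:  {desktop:.1f} GB/s")
print(f"server 12ch: {server:.1f} GB/s")
# A high-end GPU's GDDR/HBM sits around 1000 GB/s, roughly an order
# of magnitude past the desktop CPU's socketed memory -- which is the
# "spend the money on a GPU instead" argument in numbers.
```

With these assumed parts, the desktop lands near 90 GB/s and the 12-channel server near 460 GB/s, both well short of discrete-GPU memory.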
I'm pretty sure the "Framework Desktop" is a desktop, not a laptop.
The Framework Desktop is not a laptop. The clue is in the name...
https://frame.work/gb/en/desktop
The only market for desktops is gaming. Hence Nvidia will just slap a CPU on their board and use the unified memory model to sell you an all-in-one solution. Essentially a desktop console.
Maybe some modularization will survive for slow storage. But other than that demand for modular desktops is dead.
Cases will probably survive since gamers love flashy rigs.
There are a handful of professional uses for a workstation that are hard to beat with a laptop.
If you're compiling code, you generally want as much concurrency as you can get, as well as great single core speed when the task parallelism runs out. There aren't really any laptops with high core counts, and even when you have something with horsepower, you run into thermal limits. You can try and make do with remoting into a machine with more cores, but then you're not really using your laptop, it might as well be a Chromebook.
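The "concurrency until the parallelism runs out, then single-core speed" point is Amdahl's law in action. A minimal sketch (the 0.95 parallel fraction is a made-up figure for illustration):

```python
# Amdahl's law: overall speedup is capped by the serial fraction of
# the work, no matter how many cores you throw at the parallel part.

def amdahl_speedup(p: float, n_cores: int) -> float:
    """Speedup on n_cores, given parallelisable fraction p of the job."""
    return 1.0 / ((1.0 - p) + p / n_cores)

for n in (4, 16, 64, 1_000_000):
    print(f"{n:>7} cores -> {amdahl_speedup(0.95, n):.2f}x")
# Even with effectively unlimited cores the speedup caps at
# 1/(1-p) = 20x here, so the serial tail (linking, one huge
# translation unit) stays on the critical path -- hence the
# premium on single-core speed once task parallelism runs out.
```

This is also why remoting into a many-core box helps the parallel phase but doesn't rescue the serial one.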
> There are a handful of professional uses for a workstation
I've historically built my own workstations. My premise is that my most recent build may be my last or second to last. In ten years, I will still have a workstation - but not one that I build from parts.
1 reply →
All of these can be done much better in the cloud (I can spawn as big a machine as my wallet can afford). And with today's tooling (VS Code and JetBrains remote development) you don't even notice that you're developing on a remote machine rather than your local one.
So the desktop developer market is for those who are not willing to use the cloud. And this is a very small minority.
(FYI, I'm not endorsing cloud over local development, just stating where the market is.)
7 replies →
>The only market for desktops is gaming.
I disagree. My premise isn't that desktops are going away. It's that DIY custom-build desktops are destined for the trash heap of history since you'll no longer be able to buy CPUs and memory. We will be buying desktops like the HP Z2 Mini Workstation - or the 10 years from now equivalent.
>Cases will probably survive since gamers love flashy rigs
But only as a retro theme thing? Would enthusiasts just put a Z2 Mini, for example, inside the case, wire up the lights, and call it a day?
There is still a lot of productivity work that benefits from the power of desktops: engineering (Ansys etc.), local AI development, 3D modeling, working with large C++/Rust codebases, scientific computing, and so on. And related to gaming, there is of course the huge game-developer market too. There is a reason Nvidia and AMD still make workstation-class GPUs for big bucks.
But all of that hinges on fast off-chip memory. If manufacturers agree that this memory and the SoC need to be soldered, there's not much left to swap out except PCIe boards.
If the processor comes with a built-in GPU, NPU, and RAM, will you really be building the system?
Sure. Building a PC is already barely building anything. You buy a handful of components and click them into each other.
While that is mostly true, there is a large variety of motherboards. It took me a while to find something with the right SATA and PCIe slots. But after that it is just a screwdriver and some cable ties.
A lot of flexibility still exists
Yes, as that's already the case with phones. There is more to a phone than the SOC.
Who builds phones?
1 reply →
RAM? Are we expecting on-chip RAM any time soon?
Apple's done it since 2020. Intel was planning to, but walked it back. It dramatically increases performance. It also lets vendors sell you RAM at 8x the market price and forces you to replace your entire computer to upgrade it, thereby inducing you to overspend on RAM from the outset so that you don't have to spend even more replacing the whole system later.
There's literally no reason for shareholders not to demand this from every computer manufacturer. Pay up, piggie.
3 replies →