Comment by dehrmann
7 months ago
> rapid advances in chipmaking
Maybe it's a nit, but the advances are in how the chips access memory and are networked together.
> In reality, not all of the AI quintet’s servers would be useless after three years, let alone 12 months. They can keep performing oodles of non-AI work.
Not really. The AI servers are essentially useless for non-AI work.
Software can follow compute: as GPUs become cheaply available in the cloud, there's more and more pressure for software to take advantage of that.
My understanding is that GPUs aren't general purpose. If you have to resort to setting up a vast network of cellular automata to run non-GPU workloads, those servers will get trounced by a Raspberry Pi that costs pennies in power.
Yes, but CPUs and GPUs diverged long ago because they don't solve the same set of problems.
Can I run k8s on a GPU? Yes, why not. Will it be efficient? No.
(replace k8s with whatever random code you are mostly running)
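To make the inefficiency concrete: GPU hardware is SIMT, so when lanes in a warp disagree at a branch, both sides effectively get executed and a mask selects each lane's result. A minimal sketch of that predication idea (plain Python standing in for GPU lanes; the functions and the even/odd predicate are illustrative, not from any real GPU API):

```python
# Sketch: how SIMD/GPU-style hardware handles a data-dependent branch.
# Under predication, every "lane" computes BOTH sides of the branch and a
# mask picks the surviving value per lane. Fine for uniform arithmetic,
# wasteful for branch-heavy control-plane code (the "k8s on a GPU" point).

def predicated(xs):
    """GPU-style: all lanes run both branch arms, mask selects."""
    then_vals = [x * 2 for x in xs]       # every lane runs the 'then' arm
    else_vals = [x + 100 for x in xs]     # every lane also runs the 'else' arm
    mask = [x % 2 == 0 for x in xs]       # per-lane predicate
    return [t if m else e for t, e, m in zip(then_vals, else_vals, mask)]

def branchy(xs):
    """CPU-style: each element takes exactly one path."""
    return [x * 2 if x % 2 == 0 else x + 100 for x in xs]
```

Both produce identical results, but the predicated version does roughly twice the arithmetic whenever the lanes diverge; code dominated by irregular branching throws most of the GPU's throughput away.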
So we outsource the parallel calculations to the cloud but handle branching on our CPUs?