Comment by dgfitz
1 day ago
It’s funny how we seem to be on this treadmill of “tech that uses GPUs to crunch data” starting with the Bitcoin thing, moving to NFTs, now LLMs.
Wonder what’s next.
The twilight of Moore's law and diminishing returns for CPUs are driving a shift to GPUs and other accelerators. GPUs seem to do well for streaming/throughput-type workloads.
What's interesting is that Nvidia has managed to ride each of these bubbles so far.
Accelerating the calculations done in probabilistic programming languages.
Any evidence this can be done, research literature-wise?
Just a few picks, in no particular order:
https://pmc.ncbi.nlm.nih.gov/articles/PMC2945379/
https://arxiv.org/abs/2202.11154
https://lips.cs.princeton.edu/pdfs/liu2024generative.pdf
https://arxiv.org/abs/2501.05440
GPUs are incidental here; a dedicated SIMD coprocessor for matrix multiplication would be much better. We'll ditch GPUs eventually.