Comment by zozbot234
7 hours ago
Nope, the magic pixie dust language that was supposed to run Python-like code on GPU was Mojo /s
This is really just leveraging Rust's existing, unique fit across HPC/numerics, embedded programming, low-level systems programming, and even old retro-computing targets, and trying to extend that fit to the GPU by building on broad characteristics that are quite unique to Rust and relevant across most or all of those areas.
The real GPU pixie dust is called "lots of slow but efficient compute units", "barrel processing", "VRAM/HBM", and "non-flat address space(s) with explicit local memories". And of course "wide SIMD+SPMD" [0], which is the part you already mentioned and is in fact somewhat harder to target outside of special cases (though neural inference absolutely relies on it; see the sketch below). But never mind that: a lot of existing CPU code that's currently bottlenecked on memory-access throughput would absolutely benefit from being seamlessly run on a GPU.
[0] SPMD (single program, multiple data) is the proper established name for what people casually call SIMT.
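
To make the SIMD+SPMD point concrete, here is a minimal sketch in plain CUDA C++ (an illustrative assumption, not the Rust GPU tooling under discussion; the saxpy kernel and launch sizes are made up for this example):

    #include <cstdio>
    #include <cuda_runtime.h>

    // SPMD: every thread runs this same program on its own slice of
    // the data; the hardware then packs threads onto SIMD lanes.
    __global__ void saxpy(int n, float a, const float *x, float *y) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) {                  // per-thread control flow is legal,
            y[i] = a * x[i] + y[i];   // unlike classic lockstep SIMD
        }
    }

    int main() {
        const int n = 1 << 20;
        float *x, *y;
        // Unified memory keeps the sketch short; real HPC code would
        // manage the device's separate address space explicitly.
        cudaMallocManaged(&x, n * sizeof(float));
        cudaMallocManaged(&y, n * sizeof(float));
        for (int i = 0; i < n; i++) { x[i] = 1.0f; y[i] = 2.0f; }

        saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);
        cudaDeviceSynchronize();

        printf("y[0] = %f\n", y[0]);  // expect 4.0
        cudaFree(x);
        cudaFree(y);
        return 0;
    }

Each thread runs the same program on its own index, and divergence at the `if` is permitted; that per-thread control flow is what distinguishes SPMD from classic lockstep SIMD, even though the hardware ultimately executes it on wide SIMD units.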