Comment by transpute
1 year ago
> why would i ever use this thing
Los Alamos National Laboratory (U.S.) is funding Legion, which uses Terra: https://legion.stanford.edu/overview/

From that page: "Achieving high performance and power efficiency on future architectures will require programming systems capable of reasoning about the structure of program data to facilitate efficient placement and movement of data."
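For anyone who hasn't seen it, here is a rough flavor of what that looks like in Regent, the Legion frontend built on Terra. This is a sketch from memory (task/region/privilege names and exact syntax are approximate and not checked against the current release), not verified code:

```
import "regent"

-- A task declares which regions it reads and writes; the Legion runtime
-- uses those declared privileges to schedule tasks and decide where to
-- place and move the data.
task init(is : ispace(int1d), r : region(is, double), v : double)
where writes(r)
do
  for i in is do
    r[i] = v
  end
end

task daxpy(is : ispace(int1d),
           x  : region(is, double),
           y  : region(is, double),
           a  : double)
where reads(x, y), writes(y)
do
  for i in is do
    y[i] += a * x[i]
  end
end

task main()
  var is = ispace(int1d, 1024)
  var x = region(is, double)
  var y = region(is, double)
  init(is, x, 1.0)
  init(is, y, 2.0)
  daxpy(is, x, y, 0.5)
end

regentlib.start(main)
```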
Are the Stanford researchers in a basement? The lab's previous work led to CUDA. Does that earn them any consideration? How about lanl.gov using the language?
> Are the Stanford researchers in a basement?
I don't know how to explain this to you, because you and everyone else around here worship at the altar of academia (especially HYPSM academia), but the answer is absolutely 100% yes. There is literally nothing coming out of any .edu lab, even Stanford, that has any relevance in this industry (ML/AI compilers). To wit: there is no company running absolutely anything on top of Legion.
> The lab's previous work lead to CUDA
This is like saying a sketch of a car led to the car. No. CUDA is the result of thousands of working engineers' years of labor, not academia, not academic aspirational hacking.
> How about lanl.gov using the language?
"using the language". using how? every single group at lanl is using the language? they've migrated all of their fortran to using the language? they're starting new graduate programs based on the language? you really have no clue how low the barrier to entry here is - any undergrad can start an association with lanl (or any other national lab) and just start writing code in whatever language they want and suddenly "lanl is using the language". you don't believe? me go look for any number of reports/papers on julia/go/python/javascript/etc/etc/etc coming out of lanl and ornl and jpl and etc. how do i know this? i'm personally on a paper written at fermilab based primarily around task scheduling in JS lolol.
> there is no company running absolutely anything on top of legion
Does Nvidia count? [1]
> This is like saying a sketch of a car led to the car.
The ACM Turing Award (the computer science equivalent of the Nobel Prize) committee believes otherwise. [2]
Per the SIGGRAPH 2004 slides, Stanford's research on Brook was sponsored by Nvidia, ATI, IBM, Sony, DARPA, and the DOE. [3]
[1] https://web.archive.org/web/20241212055421/https://www.lanl....
[2] https://amturing.acm.org/award_winners/hanrahan_4652251.cfm
[3] https://graphics.stanford.edu/papers/brookgpu/buck.Brook.pdf
> > there is no company running absolutely anything on top of legion
> Does Nvidia count?
I don't understand. To prove that NVIDIA ships Legion, you sent me a link to a LANL post? How does that make sense? Show me a product page on NVIDIA's domain to prove that NVIDIA uses this in a product.
> The ACM Turing Award (computer science equivalent of Nobel Prize) committee believes otherwise
You seem to not get what I'm saying: my firm position is that academia doesn't understand absolutely anything in this area. Zero. Zilch. Nada. And absolutely no one in the industry cares either. So given that position, why is this relevant?
The only thing academia is good for is a talent pool of hard-working, smart people. We take those people and then completely retrain them to do actually useful work instead of research nonsense. The vast majority of PhDs coming from academia to industry (yes, even from Stanford) are literally horrible software/hardware engineers. Many of them stay that way. The good ones (at least insofar as they care about having a successful career) learn quickly. That's how you get CUDA, which is a product worth a trillion dollars.
Look, I've already told you: you worship at an altar, and you've also clearly never worked at a Stanford or an NVIDIA or a LANL. You'll never be convinced because... well, I have no idea why people need mythologies to worship.