
Comment by brandall10

7 months ago

It's best to look at this as expected value. A top AI researcher has the potential to bring in a lot more $$ than a top athlete, but of course there is a big risk factor on top of that.

The expected value is itself a random variable; there is always a chance you've mischaracterized the underlying distribution. For sports stars the variance in the expected value is extremely small, even if the variance in the sample value is quite large - it might be hard to predict how an individual sports star will do, but there is enough data to get a sense of the overall distribution and identify potential outliers.

For AI researchers pursuing AGI, the variance between distributions is arguably even worse than the variance between samples - there's no past data whatsoever to build estimates from; it's all vibes.
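A minimal sketch of that distinction, with entirely made-up numbers (the lognormal parameters and sample count below are hypothetical): per-sample variance can be huge while the variance of the *estimated* mean stays tiny, as long as you have data. With zero samples, the standard error isn't just large, it's undefined.

    # Hypothetical illustration: estimator variance vs. sample variance.
    import random
    import statistics

    random.seed(0)

    # Pretend career earnings of sports stars, in $M (made-up lognormal).
    sports = [random.lognormvariate(2.0, 1.5) for _ in range(10_000)]

    sample_sd = statistics.stdev(sports)          # spread across individuals
    se_of_mean = sample_sd / len(sports) ** 0.5   # spread of the mean estimate

    print(f"per-sample SD: {sample_sd:.1f}  |  SE of mean: {se_of_mean:.2f}")
    # 10,000 data points pin the mean down tightly despite huge individual
    # variance. For AGI researchers the sample list is empty: there is no
    # standard error to compute, which is the "all vibes" problem.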

  • We’ve seen $T+ scale impacts from AI over the past few years.

    You can argue the distribution is hard to pin down (hence my note on risk), but let’s not pretend there’s zero precedent.

    If it turns out to be another winter, at least it will have been a fucking blizzard.

    • The distribution is merely tricky to pin down when looking at overall AI spend, i.e. these "$T+ scale impacts."

      But the distribution for individual researcher salaries really is pure guesswork. How does the datapoint of "Attention Is All You Need" fit into this distribution? The authors had very comfortable Google salaries but certainly not 9-figure contracts. And OpenAI and Anthropic (along with NVIDIA's elevated valuation) were founded on their work.


If you imagine hard enough, you can expect anything - see, e.g., Extraordinary Popular Delusions and the Madness of Crowds.

  • Sure, but the idea that these hires could pay off big is within the realm of actual reality, even if AGI itself remains a pipe dream. It's not like AI hasn't already had a massive impact on global commerce and markets.