
Comment by swyx

4 hours ago

somebody said once we are mining "low-background tokens" like we are mining low-background (radiation) steel post WW2 and i couldn't shake the concept out of my head

(wrote it up in https://www.latent.space/i/139368545/the-concept-of-low-back... - but ironically, repeating something somebody else said online is kinda what i'm willingly participating in here, and it's unclear why human-origin tokens should be that much higher signal than ai-origin ones)

Low background steel is no longer necessary.

"...began to fall in 1963, when the Partial Nuclear Test Ban Treaty was enacted, and by 2008 it had decreased to only 0.005 mSv/yr above natural levels. This has made special low-background steel no longer necessary for most radiation-sensitive uses, as new steel now has a low enough radioactive signature."

https://en.wikipedia.org/wiki/Low-background_steel

every human generation built upon the slop of the previous one

but we appreciated that, we called it "standing on the shoulders of giants"

  • > we called it "standing on the shoulders of giants"

    We do not see nearly so far though.

    Because these days we are standing on the shoulders of giants that have been put into a blender and ground down into a slippery pink paste and levelled out to a statistically typical 7.3mm high layer of goo.

    • The secret is you then have to heat up that goo. When the temperature gets high enough things get interesting again.

  • We have two optimization mechanisms, though, which reduce noise with respect to the optimization functions: evolution and science. They are implicitly part of "standing on the shoulders of giants"; you pick the giant to stand on (or it is picked for you).

    Whether or not the optimization functions align with human survival, and thus whether our whole existence is not itself slop, we're about to find out.

  • Nothing conveys the idea of a solid foundation to build upon better than the word ‘slop’.

  • Because the pyramids, the theory of general relativity and the Linux kernel are all totally comparable to ChatGPT output. /s

    Why is anybody still surprised that the AI bubble got that big?