Comment by ceroxylon

12 hours ago

It obviously leans more into theories about how things could go horribly wrong if "AGI" actually happens (which can get existentially exhausting), but it's still worth contemplating.

Hank Green had an interesting discussion with one of the authors, Nate Soares: https://www.youtube.com/watch?v=5CKuiuc5cJM

Side note: I'd wager the engagement-bait title of the video was generated by AI, which is humorous to me in this context.