Comment by DirkH

6 months ago

I've yet to meet anyone who thinks it a certainty in the community. Even my short-timeline friends who don't contribute to retirement savings and otherwise have clearly seen their mental health deteriorate over the stress still don't think it a certainty at all. Just something they reason as likely enough that it is reasonable to stress out over and change life plans over.

I don't think it was dumb of scientists during the nuclear arms race in the 50s to think there was a significant enough chance humans were going to go extinct soon (years, not decades), given the developments they were seeing, and as a result to get very anxious and do things like stop their retirement savings plans.

The fears from that era were 100% well-founded within the group of scientists who were most in the know about developments. Same goes for AI development.

Talking to people I have met who used to work at OpenAI as AI researchers and left because of AI safety concerns paints a very clear "these are reasonable-headed people with well-informed beliefs" picture when they say things like there is a small but significant chance we lose control by 2027. Because humans are gonna human and prioritize shareholder value and beating China over safety.