Comment by dannyobrien
5 hours ago
So, this is not quite right: Alexander contributed to the report, but his personal opinion is more like the mid-2030s[1]. Freddie reads this as Alexander backing down from his original statement, but Alexander actually said it at the time the report was published, in a graf directly below the very quote that Freddie claims ties him to 2027:
> Do we really think things will move this fast? Sort of no - between the beginning of the project last summer and the present, Daniel’s median for the intelligence explosion shifted from 2027 to 2028. We keep the scenario centered around 2027 because it’s still his modal prediction (and because it would be annoying to change). Other members of the team (including me) have medians later in the 2020s or early 2030s, and also think automation will progress more slowly. So maybe think of this as a vision of what an 80th percentile fast scenario looks like - not our precise median, but also not something we feel safe ruling out. [2]
I don't think this changes your observation that he is "personally invested" (i.e. believes this trendline will continue), but I'm pretty sure that when AGI doesn't appear in 2027, many people will believe this invalidates the arguments being made here (or in the report). The report was actually intended to give a feel for what a near-future "disaster" AGI scenario would look like, and settled on a date to give that some concrete immediacy. The collective review that produced that date as possible, but not inevitable, is still ongoing (the authors originally pushed their best estimate out a bit further, but now think, judging by the goals that are being hit, that their scenario was a little too conservative). [3]
[1] https://freddiedeboer.substack.com/p/im-offering-scott-alexa...
[2] https://www.astralcodexten.com/p/introducing-ai-2027
[3] https://blog.aifutures.org/p/grading-ai-2027s-2025-predictio...