Comment by goatlover

10 days ago

Which would also mean the accelerationists are potentially putting everyone at risk. I'd think a soft takeoff decades in the future would give us a much better chance of building the necessary safeguards and reorganizing society accordingly.

This is a soft takeoff

We, the people actually building it, have been discussing it for decades

I started reading Kurzweil in the early 90s

If you’re not up to speed, that’s your fault

  • Decades from now. Society is nowhere near ready for a singularity. The AI we have now, as far as it has come, is still a tool for humans to use. It's more Augmented Intelligence than AGI.

    A hard takeoff would be the tool bootstrapping itself into an autonomous self-improving ASI in a short amount of time.

    And I read Kurzweil years ago too. He thought that reverse engineering the human brain, once the hardware was powerful enough, would give us the singularity by 2045, and that the Turing Test would be passed by 2029—though it seems LLMs have already accomplished this.

    • 20% of the human population is still not using the internet

      Imagine you’re 70 years old, in rural North Carolina, sitting on your porch wondering why your house is covered in a sheet of ice—that’s never happened before. Now your already weak soybean harvest yields only 30% that year

      Meanwhile your 30-year-old neighbor just had a productive soybean harvest, because they checked the forecast online and covered their crops before the freeze

      That seemingly trivial gap between people who use information technology to improve their survivability and those who don’t has been widening, unabated, for the last few hundred years.

      This is what the soft singularity looks like