
Comment by Davidzheng

2 days ago

Many people will have to ask themselves these questions soon regardless of their actions. I don't understand the critique here.

It's more just pondering out loud how automating ourselves out of a job, in an economic system that requires us to have a job, is going to pan out for the large majority of people in the coming years.

  • As someone who has been pondering this very question since 2015, I'm starting to think we have been:

    - underestimating how much range humans have in their intelligence and how important that range is to productivity;

    - overestimating how close LLMs are to replicating that range, and underestimating how hard it will be for AI to reach it;

    - underestimating the human capacity to become dissatisfied and invent more work for people to do;

    - underestimating the unmet demand for the work people are already doing, which LLMs can make orders of magnitude more efficient.

    I was pretty convinced of the whole "post-scarcity" Singularity mindset up until the last year or two... My confidence is low, but I'm now leaning more towards Jevons paradox abounding (efficiency gains driving more total demand for the work, not less) and a very slow superintelligence takeoff, with more time for the economy to adapt.

    The shift in my view has come from spending thousands of hours working with LLMs to code and building applications powered by LLMs, trying to get them to do things and constantly running into their limitations, and noting how the boundary of those limitations has been changing over time. (It looks more like an S-curve to me than an exponential takeoff.) Also some recent interviews with leading researchers, and spending a few hundred hours studying the architecture of the human brain and theories of intelligence.