Comment by kilroy123

7 days ago

I agree with you on all of it.

But _what if_ they work out all of that in the next 2 years and it stops needing constant supervision and intervention? Then what?

It’s literally not possible. It has nothing to do with intelligence. A perfectly intelligent AI still can’t read minds. 1000 people give the same prompt and want 1000 different things. Of course it will need supervision and intervention.

We can synthesize answers to questions more easily, yes. We can make better use of extensive test suites, yes. We cannot give 1000 different correct answers to the same prompt. We cannot read minds.

  • Can you? Read minds, I mean.

    If the answer is "yes"? Then, yeah, AI is not coming for you. We can make LLMs multimodal, teach them to listen to audio or view images, but we have no idea how to give them ESP modalities like mind reading.

    If the answer is "no"? Then what makes you think that your inability to read minds beats that of an LLM?

    • This is kind of the root of the issue. Humans are mystical beings with invisible sensibilities. Many of our thoughts come from a spiritual plane, not from our own brains, and we are all connected in ways most of us don't fully understand. In short, yes I can read minds, and so can everybody else.

      Today's LLMs are fundamentally the same as any other machine we've built, and there is no reason to think they have mystical sensibilities.

      We really need to start distinguishing between "intelligence" and "relevance". An AI can be perfectly intelligent, but without input from humans it has no connection to our Zeitgeist, no source material. Smart people can be stupid too: intelligent but disconnected from society, making smart but irrelevant decisions, just like AI models always will.

      AI is like an artificial brain, and a good one, but there is more to human intelligence than a brain. AI is just a brain; we are more.

If you have an AI that's the equivalent of a senior software developer, you essentially have AGI, and in that case the entire world will fundamentally change. I don't understand why people keep bringing up software development specifically as something that will be automated, ignoring the implications for all white-collar work (and the world in general).

Then who else is still holding a job if a tool like that is available? Manual laborers, for the few months or years before robotics development, fueled by cheap human-level LLMs, catches up?