Comment by anandnair

5 months ago

Coding, especially the type mentioned in the article (building an app from a specification), is a highly complex task. It cannot be completed with a single prompt that yields an immediate, flawless result.

This is why even software projects built by humans go through multiple iterations before they work correctly.

We should consider a few things before asking, "Can AI code like humans?":

- How did AI learn to code? What structured curriculum was used?

- Did AI receive mentoring from an experienced senior who has solved real-life issues that the AI hasn't encountered yet?

- Did the AI learn through hands-on coding or just by reading Stack Overflow?

If we want to model AI as being on par with (or even superior to) human intelligence, don’t we at least need to consider how humans learn these complex skills?

Right now, it's akin to giving a human thousands of coding books to "read" and "understand," but offering no opportunity to test their programs on a computer. That’s essentially what's happening!

Without doing that, I don't think we'll ever be able to determine whether the limitation of current AI is due to its "low intelligence" or because it hasn’t been given a proper opportunity to learn.

LLMs can fundamentally only do something similar to learning during the training phase. So by the time you interact with one, it has already learned all it can. The question we then care about is whether it has learned enough to be useful for problem X. There's no meaningful concept of how "intelligent" the system is beyond what it has learned; there is no abstract IQ test, decoupled from base knowledge, that you could even conceive of.
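To make this concrete, here is a minimal sketch (assuming the Hugging Face transformers API and the public GPT-2 checkpoint): at inference time gradients are disabled and the weights never change, so no amount of prompting teaches the model anything new.

```python
# Minimal sketch: an LLM's weights are frozen at inference time.
# Assumes the Hugging Face transformers library and the public GPT-2 checkpoint.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()  # inference mode: no dropout, no parameter updates

before = model.transformer.h[0].attn.c_attn.weight.clone()

with torch.no_grad():  # no gradients computed, hence no learning
    ids = tok("def quicksort(arr):", return_tensors="pt").input_ids
    model.generate(ids, max_new_tokens=20)

after = model.transformer.h[0].attn.c_attn.weight
print(torch.equal(before, after))  # True: the interaction changed nothing
```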

> How did AI learn to code?

It didn't; it's just very good at copying existing code and tweaking it a bit.

> Did AI receive mentoring from an experienced senior

It doesn't even comprehend what an experienced senior is; all it cares about is how frequently certain patterns occurred in certain circumstances.

> Did the AI learn through hands-on coding or just by reading Stack Overflow?

it "learnt" by collecting a large database of existing code, most of which is very low quality open source proofs of concept, then spits out the bits that are probably related to a question.

  • I think we're drastically oversimplifying what "pattern matching" means. It is also one of the fundamental mechanisms by which the human brain operates. I believe we are consciously (or perhaps subconsciously) conditioned to think that human "logic" and "reasoning" are several degrees more advanced than pattern matching. However, I don't think this is true.

    The fundamental difference lies in how the patterns are formed in each case. For LLMs, all they know are the patterns they observe in "words"; that is the only "sense" they possess. But for humans, pattern recognition involves continuously ingesting and identifying patterns across our five primary senses, not just separately but simultaneously.

    For example, when an LLM describes something as "ball-shaped," it cannot feel the shape of a ball because it lacks another sense to associate with the word "ball-shaped." In contrast, humans have the sense of touch, allowing them to associate the word or sound pattern "ball" with the physical sensation of holding a ball.
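    A rough sketch of that point, with toy numbers and a hypothetical three-word vocabulary: to a language model, "ball" is just an integer index into an embedding table whose vectors are shaped entirely by co-occurrence with other words. Nothing in them encodes touch.

    ```python
    # Toy illustration: "ball" is only an index into a text-derived embedding
    # table (hypothetical vocab, random numbers standing in for trained values).
    import numpy as np

    vocab = {"ball": 0, "round": 1, "accounting": 2}
    rng = np.random.default_rng(0)
    embeddings = rng.normal(size=(len(vocab), 4))  # learned from text statistics only

    ball_vector = embeddings[vocab["ball"]]
    print(ball_vector)  # four floats; nothing encodes what holding a ball feels like
    ```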

    • > It is also one of the fundamental mechanisms by which the human brain operates.

      One of the fundamental mechanisms by which brains operate, that is: the bits we share with every other animal that has a brain. Good luck teaching your dog to code.

      Being great at fetching your newspaper in the morning doesn't mean it's going to wake up and write you an accounting software package at the end of the year.
