Comment by mirekrusin

4 days ago

Stochastic parrot? Autocomplete on steroids? Fancy autocorrect? Bullshit generator? AI snake oil? Statistical mimicry?

You don't hear that anymore.

Feels like a whole generation of skeptics evaporated.

I think the "stochastic" part is true but useless as a criticism. It can be applied to anyone or anything. Yes, the models give you probabilities, but any algorithm gives you probabilities (just zero or one for deterministic ones). You can certainly view the human mind as a complex statistical model of the world.

Now, that being said, do I think they are as good as a skilled human on most things? No, I don't. My trust issues have increased after the GPT-5 presentation. The very first question was to showcase its "PhD-level" knowledge, and it gave a wrong answer. It just happened to be in a field I know enough about to notice, but most didn't.

So, while I think they can be considered as having some form of intelligence, I believe they have more limits than a lot of people seem to realise.

I certainly still hold those opinions, because the models have yet to prove they are worth a person's time. I don't bother posting that because there's no way an AI hype person and I are ever going to convince each other, so what's the point?

The skeptics haven't evaporated, they just aren't bothering to try to talk to you any more because they don't think there's value in it.

  • So you don't even try LLMs regularly?

And what about all the other ML progress, like image generation, 3D world generation, etc.?

I've vibe coded plenty of small things I never had the time for. Is there nothing you've wanted to build that fits in a single-page HTML application? It can even use local storage, etc.

  • [flagged]

    • This is why they don't talk to you anymore. The only comparison you can make to a flat earther is that you think they're wrong, and flat earthers are also wrong. It's just dumb invective, and people don't like getting empty insults. I prefer my insults full.

"The earth is flat" stands until you have evidence to the contrary. It's you who should provide that evidence. We had physics, navigation, and then space shuttles that clearly showed the earth is not flat.

We have yet to see a fully vibe-coded piece of software that actually works. The blog post is actually great because LLMs are very good at regurgitating pieces of code that already exist from a single prompt. Now ask them to make a few changes and suddenly the genie is back in the bottle.

Something doesn't add up. You can't be both a genius and extremely dumb at the same time. You can, however, be good at information retrieval and at presenting it in a better way. That's what LLMs are, and I'm not discounting the usefulness of that.


Still haven't seen anything proving it's not autocomplete on steroids or statistical mimicry.

It is all those things.

The Bitter Lesson is that, with enough VC-subsidised compute, those things become useful.

Those echoes have grown louder over the past year or so. The only way you've heard less of them is if you buried your head in the sand.

It is all those things. It consistently fails to make truly novel discoveries; everything it does is derived from something it was trained on.

No point in arguing about it though with true believers, they will never change their minds.