
Comment by sim7c00

1 day ago

You are not wrong. The only 'sane' approach I've seen with vibe coding is making a PoC to see if some concept works, then rewriting it entirely to make sure it's sound.

Besides producing weird or broken code, anything exposed to user input is usually severely lacking in sanity checks and the like.

LLMs are not useless for coding, but IMHO letting LLMs do the coding will not yield production-grade code.

Koko the gorilla understood language, but most others of her ilk simply make signs because a thing will happen.

Move hand this way and a human will give a banana.

LLMs have no understanding at all of the underlying language; they've just seen, a billion times, that a task that looks like such-and-such has these tokens after it.

  • What does it matter whether they have understanding of the underlying language or not? Heck, do humans even have "understanding of the underlying language"? What does that even mean?

    It's a model. It either predicts usefully or not. How it works is mostly irrelevant.

    • Defining what that means exactly is one endeavor. But it's important to the how: if that understanding is absent, it implies a drastically limited set of capabilities, a ceiling, compared to if it were present.

    • Interesting take. I don't know a lot about grammar, yet I can speak my own language fairly well...

      All I know about these LLMs is that even if they understand language, or can create it, they know nothing of the subjects they speak of.

      Copilot told me to cast an int to str to get rid of an error.

      Thanks, Copilot: it was kernel code.

      Glad I didn't do it :/. I just closed the browser and opened the man pages. I get nowhere with these things. It feels like you need to understand so much that it's likely less typing to just write the code yourself. Code is concise and clear, after all, and mostly unambiguous. Language, on the other hand...

      I do like it as a bit of a glorified Google, but looking at the code it outputs, my confidence in its findings lessens with every prompt.


    • I think that, more often than we'd like to admit, we humans are also not really thinking about or understanding what we are communicating, and are just outputting the statistically most likely next word over and over.

    • Nobody knows what intelligence is, yet somehow everyone has a strong opinion on what it isn't; after all, how could piecewise affine transformations/Markov chains/differential equations EVER do X?

  • There’s been a lot of criticism that Koko’s language abilities were overblown and her expressions were overinterpreted as well.

The PoC approach seems to work for me lately. It still takes effort to convince my manager that it makes sense to devote time to polishing it afterwards, but some of the initial reluctance is mitigated.

edit: Not a programmer. Just a guy who needs some stuff done for some of the things I need to work on.