Comment by qudat
4 days ago
Wrong. I will spend 30 minutes having the LLM explain every line of code and why it's important, with context-specific follow-up questions. An LLM is one of the best ways to learn ...
So far, each and every time I used an LLM to help me with something, it hallucinated non-existent functions or was incorrect in an important but non-obvious way.
Though, I guess I do treat LLMs as a last-resort longshot for when other documentation is failing me.
Knowing how to use LLMs is a skill. Just winging it without any practice or exploration of how the tool fails can produce poor results.
"You're holding it wrong"
99% of an LLM's usefulness vanishes if it behaves like an addled old man.
"What's that sonny? But you said you wanted that!"
"Wait, we did that last week? Sorry let me look at this again"
"What? What do you mean, we already did this part?!"
Which LLMs have you tried? Claude Code seems to be decent at not hallucinating; Gemini CLI is more eager.
I don't think current LLMs take you all the way, but a powerful code generator is a useful thing. Just assemble guardrails and keep an eye on it.
Mostly ChatGPT, because I see zero value in paying for any LLM, nor do I wish to give up my data to any LLM provider.
As long as what it says is reliable and not made up.
That's true for internet searching. How many times have you gone to SO, seen a confident answer, tried it, and it failed to do what you needed?
Then you write a comment, maybe even figure out the correct solution and fix the answer. If you're lucky, somebody already did. Everybody wins.
That's what LLMs take away. Nothing is given back to the community, nothing is added to shared knowledge, no differing opinions are exchanged. It just steals other people's work from a time when work was still shared and discussed, removes any indication of its source, claims it's a new thing, and gives you no way to contribute back, or even discuss it and maybe get confronted with different opinions or discover a better way.
Let's not forget that one of the main reasons LLMs are useful for coding in the first place is that they scraped SO from the time when people still used it.
I feel like we are just covering whataboutism tropes now.
You can absolutely learn from an LLM. Sometimes documentation sucks, and the LLM has learned how to put stuff together from examples found in unusual places, and it works, and shows what the documentation failed to demonstrate.
And with the people above, I agree - sometimes the fun is in the end process, and sometimes it is just filling in the complexity we do not have time or capacity to grab. I for one just cannot keep up with front-end development. It's an insurmountable nightmare of epic proportions. I'm pretty skilled at my back-end deep-dive data and connecting APIs, however. So - AI to help put together a coherent interface over my connectors, and off we go for my side project. It doesn't need to be SOC2 compliant and OWASP proof, nor does it need ISO27001 compliance testing, because after all this is just for fun, for me.