Comment by Spivak
13 hours ago
Even when it hallucinates it still solves most of the unknown unknowns which is good for getting you unblocked. It's probably close enough to get some terms to search for.
I don't think so. How can you be so sure it solves the "unknown unknowns"?
Sample size of 1, but it definitely did in my case. I've gained a lot more confidence when coding in domains or software stacks I've never touched before, because I know I can trust an LLM to explain things like the basic project structure and unfamiliar parts of the ecosystem, bounce ideas off of, or produce a barebones one-file prototype that I rewrite to my liking. That unlocks a whole lot of tasks that simply wouldn't justify the time expenditure otherwise; it would have been effort-prohibitive to even try to automate or build the thing.
Because I've used it for problems where it hallucinated code that didn't actually exist, but the hallucination was good enough to tell me the right terms to search for in the docs.
I interpreted that as you rushing to code something you should have approached with a book or a guide first.