Comment by evidenciary

4 days ago

I don't think the "non-deterministic" accusation is a good one. Like "hallucination", it's a bit of misdirection.

These LLMs are buggy. They have bugs. They don't do what they promise. Sometimes they do; other times they give garbled output.

This is buggy software. And after years and billions of dollars, the bug persists.