Comment by IanCal
9 months ago
This is true but misses a key fact: typical LLM errors are different from human errors. Not that they're worse or better, but you need to understand where and when they're more likely to make mistakes and how to manage that.