Comment by somebehemoth
17 days ago
I know nothing about OCR providers. It seems like an OCR failure would produce gibberish or awkward wording that's easy to spot. Isn't the LLM failure mode asserting made-up "truths" eloquently, which makes them harder to spot?