Comment by dualvariable
18 hours ago
It doesn't matter whether it's AI hallucinations or entirely human scientific fraud: the problem is the same, and the solution works fine for both cases.
If you can't validate that your bibliography is full of real articles, you shouldn't get published.
LLMs have just poured gasoline on the fire.