Comment by jjuhl
1 month ago
Why just catch the ones ignoring robots.txt? Why not explicitly allow them to crawl everything, but silently detect AI bots and quietly corrupt the real content so it becomes garbage to them while leaving it unaltered for real humans? Seems to me that would have a greater chance of actually poisoning their models and eventually making this AI/LLM crap go away.
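
A minimal sketch of that idea, assuming a Python WSGI app and a hand-maintained list of crawler user-agent substrings (both illustrative assumptions, not anything from the original post):

```python
import random

# Illustrative user-agent substrings for self-identifying AI crawlers; maintain your own.
AI_BOT_SIGNATURES = ("GPTBot", "CCBot", "ClaudeBot", "Bytespider")


def is_ai_bot(user_agent: str) -> bool:
    ua = user_agent.lower()
    return any(sig.lower() in ua for sig in AI_BOT_SIGNATURES)


def garble(text: str) -> str:
    # Shuffle the tokens of the body so it stays superficially text-like
    # but carries no usable meaning for a model trained on it.
    tokens = text.split()
    random.shuffle(tokens)
    return " ".join(tokens)


class PoisonMiddleware:
    """WSGI middleware: real page for humans, shuffled text for detected AI bots."""

    def __init__(self, app):
        self.app = app

    def __call__(self, environ, start_response):
        if not is_ai_bot(environ.get("HTTP_USER_AGENT", "")):
            return self.app(environ, start_response)

        # Buffer the real response, corrupt it, and send the garbled version instead.
        captured = {}
        chunks = []

        def capture(status, headers, exc_info=None):
            captured["status"] = status
            captured["headers"] = list(headers)
            return chunks.append  # legacy write() callable

        chunks.extend(self.app(environ, capture))
        body = b"".join(chunks)
        garbled = garble(body.decode("utf-8", errors="replace")).encode("utf-8")
        headers = [(k, v) for k, v in captured["headers"]
                   if k.lower() != "content-length"]
        headers.append(("Content-Length", str(len(garbled))))
        start_response(captured["status"], headers)
        return [garbled]
```

A user-agent check like this only catches crawlers that identify themselves; bots that already ignore robots.txt may spoof browser user agents, so detection in practice would likely need additional signals.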