Comment by voidUpdate
1 month ago
> I don't see how you get around LLMs scraping data without also stopping humans from retrieving valid data
Well, LLM scrapers love to scrape All The Pages, so just list some disallowed pages in your robots.txt that no human would ever visit, and watch the LLM scrapers consume them
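A minimal sketch of the idea: robots.txt disallows a trap path (here `/secret-trap/`, a made-up name), and anything that fetches it anyway is probably a scraper ignoring robots.txt. The robots.txt side would just be `User-agent: *` followed by `Disallow: /secret-trap/`. The log format and path are assumptions for illustration:

```python
# Hypothetical honeypot path, listed under Disallow: in robots.txt.
# No legitimate crawler or human should ever request it.
HONEYPOT_PATH = "/secret-trap/"

def flag_scrapers(log_lines):
    """Return the set of client IPs that requested the honeypot path.

    Expects common-log-style lines:
    '<ip> - - [date] "GET /path HTTP/1.1" <status> <bytes>'
    """
    flagged = set()
    for line in log_lines:
        parts = line.split('"')
        if len(parts) < 2:
            continue
        request = parts[1]  # e.g. 'GET /secret-trap/page HTTP/1.1'
        fields = request.split()
        if len(fields) >= 2 and fields[1].startswith(HONEYPOT_PATH):
            flagged.add(line.split()[0])  # first field is the client IP
    return flagged

logs = [
    '203.0.113.5 - - [01/Jan/2025] "GET /index.html HTTP/1.1" 200 512',
    '198.51.100.7 - - [01/Jan/2025] "GET /secret-trap/page HTTP/1.1" 200 64',
]
print(flag_scrapers(logs))  # only the client that hit the trap is flagged
```

From there you could feed the flagged IPs into a block list or rate limiter; the detection itself is just "did they fetch something robots.txt told them not to."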