Comment by dazc
8 days ago
I've witnessed a few catastrophes caused by mistakes in robots.txt, especially when 'Disallow' is used in an attempt to prevent pages from being indexed.
I don't know if the claims made here are true, but there really isn't any reason not to have a valid robots.txt available. One could argue that if you want Google to respect robots.txt, then not having one should result in Googlebot not crawling any further.
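(For context on the mistake described above: a robots.txt 'Disallow' only blocks crawling, not indexing, so a disallowed URL can still appear in Google's results if other sites link to it. A minimal sketch of the two mechanisms, with a hypothetical /private/ path:)

```
# robots.txt — blocks crawling of the path, but NOT indexing of the URL
User-agent: *
Disallow: /private/

<!-- On the page itself — actually prevents indexing, but only works if
     the crawler is ALLOWED to fetch the page and see this tag -->
<meta name="robots" content="noindex">
```

Note the trap: combining the two is self-defeating, because the Disallow rule stops Googlebot from ever fetching the page and seeing the noindex tag.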