Slacker News

Comment by cdrini

18 hours ago

Exactly. Self-identifying crawlers like Google and Bing aren't the issue: they obey robots.txt and can easily be blocked by user-agent checks. Non-identifying crawlers, which present humanlike user agents and are usually distributed (so they get around IP-based rate limits), are the main ones that are challenging to deal with.
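The distinction can be sketched with a minimal user-agent check. The crawler tokens below ("Googlebot", "bingbot") are the real substrings those bots put in their User-Agent headers; the function name and blocklist policy are illustrative assumptions, not any particular site's implementation:

```python
# Sketch of a user-agent check for self-identifying crawlers.
# Token list and function name are illustrative, not a real site's config.
KNOWN_CRAWLER_TOKENS = ("googlebot", "bingbot")

def is_known_crawler(user_agent: str) -> bool:
    """Return True if the User-Agent self-identifies as a known crawler."""
    ua = user_agent.lower()
    return any(token in ua for token in KNOWN_CRAWLER_TOKENS)

# A distributed crawler sending a humanlike browser user agent passes
# this check unnoticed, which is exactly why such crawlers are hard to block.
```

Note that this check only works against bots that choose to identify themselves; a crawler spoofing a stock browser string is indistinguishable from a human visitor at this layer.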

