Comment by remexre
7 days ago
> unless that crawl is unreasonably expensive or takes it down for others
This _is_ the problem Anubis is intended to solve: on forges like Codeberg or Forgejo, many routes perform expensive Git operations (e.g. `git blame`), and scrapers do not respect the robots.txt asking them not to hit those routes.
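For illustration, the kind of request scrapers ignore looks roughly like this (a sketch only; the exact paths are hypothetical, modeled on the route layout a Forgejo-style forge might expose):

```text
# Hypothetical robots.txt for a Forgejo-style forge:
# asks crawlers to skip routes that trigger expensive Git operations.
User-agent: *
Disallow: /*/blame/*
Disallow: /*/commits/*
Disallow: /*/compare/*
```

Since robots.txt is purely advisory, a scraper that ignores it still hammers those routes, which is why Anubis gates them behind a challenge instead.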