Comment by mhuffman

17 hours ago

I don't understand the reasoning behind the "feed them a bunch of trash" option. If you can identify them (for example, by the fact that they ignore robots.txt), you can just keep them hung up on network connections or similar, without paying to generate infinite garbage for crawlers to ingest.
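For what it's worth, here's a minimal sketch of the "keep them hung up on the connection" idea the comment alludes to (a hypothetical asyncio tarpit, not anything from the thread): once a request is flagged as coming from a crawler, trickle out a never-ending response one byte at a time with long pauses, so the crawler's connection stays occupied without any content being generated.

```python
import asyncio

async def tarpit(reader, writer):
    # Hypothetical example: assume anything reaching this handler has already
    # been identified as a misbehaving crawler (e.g. it ignored robots.txt).
    await reader.read(4096)  # discard whatever request the crawler sent
    writer.write(b"HTTP/1.1 200 OK\r\nContent-Type: text/html\r\n\r\n")
    await writer.drain()
    try:
        while True:
            writer.write(b".")       # one byte at a time...
            await writer.drain()
            await asyncio.sleep(10)  # ...with long pauses, tying up the client
    except (ConnectionResetError, BrokenPipeError):
        pass
    finally:
        writer.close()

async def main():
    server = await asyncio.start_server(tarpit, "0.0.0.0", 8080)
    async with server:
        await server.serve_forever()

if __name__ == "__main__":
    asyncio.run(main())
```

The cost asymmetry is the point: each stalled connection costs the server almost nothing (an idle socket and a timer), while the crawler either burns a connection slot for a long time or has to implement timeouts and give up.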