Comment by TZubiri

3 days ago

As I understand it, this is Proof of Work, which is strictly not a cat-and-mouse situation.

That is because you are dealing with crawlers that already have a nontrivial cost per page: adding something relatively trivial on top, small enough that regular users still accept it, won't change the motivations of bad actors at all.

  • What is the existing cost per page? As far as I know, an HTTP request and some string parsing are fairly trivial, say 14kb of bandwidth per page?
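For context on what "adding something relatively trivial" means, schemes like this are typically hashcash-style: the server hands out a challenge, and the client must burn CPU finding a nonce whose hash meets a difficulty target before the page is served. A minimal sketch below; the challenge string, difficulty, and function names are illustrative and not taken from any specific implementation:

```python
import hashlib
import itertools

def solve_pow(challenge: str, difficulty_bits: int) -> int:
    """Brute-force a nonce so that SHA-256(challenge:nonce), read as an
    integer, falls below a target with `difficulty_bits` leading zero bits.
    Expected work is about 2**difficulty_bits hash evaluations."""
    target = 1 << (256 - difficulty_bits)
    for nonce in itertools.count():
        digest = hashlib.sha256(f"{challenge}:{nonce}".encode()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce

def verify_pow(challenge: str, nonce: int, difficulty_bits: int) -> bool:
    """Server-side check: a single hash, regardless of difficulty."""
    digest = hashlib.sha256(f"{challenge}:{nonce}".encode()).digest()
    return int.from_bytes(digest, "big") < (1 << (256 - difficulty_bits))

# ~2**16 hashes on average for the client; one hash for the server.
nonce = solve_pow("example-challenge", 16)
assert verify_pow("example-challenge", nonce, 16)
```

The asymmetry is the point: verification costs the server one hash, while solving costs the client a tunable amount of CPU. The economic question raised above is whether that tunable amount, capped by what human visitors will tolerate, is large relative to a crawler's existing per-page cost.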