
Comment by ranger_danger

3 days ago

The compute also seems to happen only once, not on every page load, so I'm not sure how this is a huge barrier.

Once per IP. Presumably there's IP-based rate limiting implemented on top of this, so it's a barrier for scrapers that aggressively rotate IPs to circumvent rate limits.

It happens once if the user agent keeps a cookie that can be used for rate limiting. If a crawler hits the limit, it needs to either wait or throw the cookie away and solve another challenge.