Comment by jsnell
3 days ago
The human needs to wait for their computer to solve the challenge.
You are trading something dirt-cheap (CPU time) for something incredibly expensive (human latency).
Case in point:
> If the challenge takes, say, 250 ms on the absolute best hardware, and serving a request takes 25 ms, a normal user won't even see a difference, while a scraper will see a tenfold slowdown while scraping that website.
No. A human sees a 10x slowdown. A human on a low-end phone sees a 50x slowdown.
And the scraper paid one millionth of a dollar. (The scraper does not care about latency.)
That is not an effective deterrent. And there is no difficulty setting for the challenge that will work: either you add too much latency for real users, or passing the challenge is too cheap to deter scrapers.
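To put rough numbers on it (the 250 ms solve time is taken from the quote above; the cloud CPU price is an illustrative assumption, not a measured figure):

```python
# Back-of-envelope cost of one proof-of-work solve to a scraper.
# Assumptions: 250 ms of CPU per solve (from the quote above) and bulk
# cloud CPU at roughly $0.02 per core-hour (illustrative, not measured).
solve_seconds = 0.250
core_hour_usd = 0.02
cost_per_solve = solve_seconds / 3600 * core_hour_usd

print(f"cost per solve:    ${cost_per_solve:.9f}")       # ~$0.0000014
print(f"solves per dollar: {1 / cost_per_solve:,.0f}")   # ~720,000
# The human, meanwhile, eats the full 250 ms (or far more on a slow
# phone) before the page even starts loading.
```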
>No. A human sees a 10x slowdown.
For the actual request, yes. For the complete experience of using the website, not so much, since a human will take at least several seconds to process the information returned anyway.
>And the scraper paid one millionth of a dollar. (The scraper does not care about latency.)
The point need not be to punish the client, but to throttle it. The scraper may not care about taking longer, but the website's operator may very well care about not being hammered by requests.
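Back-of-envelope, and assuming for the sake of argument that every request needs a fresh solve (numbers taken from the quote above, purely illustrative):

```python
# Requests per second one CPU core can sustain if every request needs a
# fresh 250 ms solve on top of the 25 ms of serving time from the quote.
solve_s, serve_s = 0.250, 0.025
print(f"with challenge:    {1 / (solve_s + serve_s):.1f} req/s per core")  # ~3.6
print(f"without challenge: {1 / serve_s:.1f} req/s per core")              # 40.0
```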
But now I have to wait several seconds before I can even start to process the webpage! It's like the internet suddenly became slow again overnight.
Yeah, well, bad actors harm everyone. Such is the nature of things.
A proof-of-work challenge does not throttle scrapers at steady state. All it does is add latency and cost to the first request.
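A minimal sketch of why, assuming the common pattern where the solved challenge is exchanged for a long-lived cookie (the puzzle shape, difficulty, and cookie format below are illustrative, not any specific implementation):

```python
import hashlib
import itertools

def solve_challenge(seed: bytes, difficulty_bits: int) -> int:
    """Hashcash-style puzzle: find a nonce so that sha256(seed || nonce)
    has difficulty_bits leading zero bits. Real schemes vary in detail."""
    target = 1 << (256 - difficulty_bits)
    for nonce in itertools.count():
        digest = hashlib.sha256(seed + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce

# The scraper pays the solve cost exactly once...
nonce = solve_challenge(b"server-issued-seed", difficulty_bits=16)
cookie = f"pow={nonce}"  # hypothetical cookie name/format

# ...then replays the cookie on every subsequent request until it expires,
# so the marginal cost per scraped page is effectively zero:
# for url in crawl_queue:
#     session.get(url, headers={"Cookie": cookie})
```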
Hypothetically, the cookie could be used to track the client and increase the difficulty if the client's usage becomes abusive.
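A rough sketch of what that might look like (every name and threshold here is made up for illustration, not any particular implementation):

```python
import math
import time
from collections import defaultdict

BASE_BITS = 12    # assumed baseline difficulty: cheap enough that humans never notice
WINDOW_S = 60     # sliding window for estimating a client's request rate
_recent = defaultdict(list)   # cookie/client id -> timestamps of recent requests

def difficulty_for(client_id: str) -> int:
    """Return the puzzle difficulty (leading-zero bits) for this client.
    Human-ish rates get the baseline; each doubling of the request rate
    beyond 1 req/s adds a bit, roughly doubling the expected solve time."""
    now = time.time()
    window = [t for t in _recent[client_id] if now - t < WINDOW_S]
    window.append(now)
    _recent[client_id] = window

    rate = len(window) / WINDOW_S    # requests per second over the window
    extra = int(math.log2(rate)) if rate > 1 else 0
    return BASE_BITS + extra
```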