
Comment by Retr0id

8 days ago

> No matter where the bar is, there will always be scrapers willing to jump over it, but if you can raise the bar while holding the user-facing cost constant, that's a win.

No, but what I'm saying is that these scrapers are already not using GPUs or ASICs. It just doesn't make any economic sense to do that in the first place. They are running the same JavaScript code on the same commodity CPUs and in the same JavaScript engine as the real users. So switching to an ASIC-resistant algorithm will not raise the bar. It's just going to be another round of the security theater that proof of work was in the first place.
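For concreteness, here is a minimal sketch of the kind of hash-preimage challenge under discussion, written against the WebCrypto API; the challenge format, difficulty encoding, and function names are illustrative assumptions, not Anubis's actual protocol. The point it illustrates is that the exact same code runs on the same engine whether the caller is a browser tab or a headless scraper on a commodity CPU.

```javascript
// Sketch of a SHA-256 preimage proof-of-work: find a nonce such that
// SHA-256(challenge + nonce) has at least `difficultyBits` leading zero bits.
// Challenge format and difficulty encoding are assumptions for illustration.
async function solve(challenge, difficultyBits) {
  const enc = new TextEncoder();
  for (let nonce = 0; ; nonce++) {
    const data = enc.encode(challenge + nonce);
    const hash = new Uint8Array(await crypto.subtle.digest('SHA-256', data));
    if (leadingZeroBits(hash) >= difficultyBits) return nonce;
  }
}

// Count leading zero bits of a byte array.
function leadingZeroBits(bytes) {
  let bits = 0;
  for (const b of bytes) {
    if (b === 0) { bits += 8; continue; }
    return bits + Math.clz32(b) - 24; // clz32 counts from bit 31 of a 32-bit value
  }
  return bits;
}

// Runs unmodified in a browser tab or under Node 18+ / headless Chrome:
// solve('server-issued-challenge', 16).then(nonce => console.log(nonce));
```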

  • They might not be using GPUs, but their servers definitely have finite RAM. Memory-hard PoW reduces the number of concurrent sessions you can maintain per fixed amount of RAM (see the memory-hard sketch at the end of this thread).

    The more sites get protected by Anubis, the stronger the incentive for scrapers to actually switch to GPUs etc. It wouldn't take all that much engineering work to hook the WebCrypto APIs up to a GPU implementation (although it would still be fairly inefficient that way). If you're scraping a billion pages, the costs add up.

    • You'd only need the memory for a couple of seconds, during which you're pegging a CPU core on the computation anyway. It is not needed for the entirety of the browsing session.

      Now, could you construct a challenge that forces the client to keep a ton of data in memory and to regularly prove, throughout the session, that it still has that data? I don't think so. The problem is that for that kind of intermittent proof there's no need to actually keep the data in low-latency memory. It can just be stored on disk and paged in when needed (not often). It's a very different access pattern from the cryptocurrency use case.
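Some rough arithmetic for the intermittent-proof scenario described in the reply above; all figures are assumptions for illustration, not measurements of any real scheme. The point is that infrequent random probes of a large buffer are cheap to serve from disk, so the data never has to occupy RAM.

```javascript
// Assumed scenario: the client must prove possession of a 1 GiB challenge
// buffer by hashing a handful of random 4 KiB pages once per minute.
const pagesPerProof = 8;           // random pages sampled per proof (assumed)
const pageBytes     = 4 * 1024;
const ssdRandomRead = 100e-6;      // ~0.1 ms per random 4 KiB read (typical SSD)

const bytesTouched   = pagesPerProof * pageBytes;      // 32 KiB touched per proof
const ioTimePerProof = pagesPerProof * ssdRandomRead;  // ~0.8 ms of I/O per minute

console.log(`${bytesTouched} bytes and ~${(ioTimePerProof * 1e3).toFixed(1)} ms of I/O per proof`);
```

At that access rate the 1 GiB buffer can sit on disk the whole time, which is exactly the "very different access pattern" the reply points to.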
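And for the memory-hard point raised earlier in the thread, here is a rough sketch in the scrypt/ROMix style, again built on WebCrypto SHA-256: fill a large buffer with a hash chain, then make data-dependent reads back into it so the whole buffer has to stay resident while solving. The parameters (64 MiB, 32-byte blocks) are illustrative and untuned, verification asymmetry is ignored, and this is not what Anubis or any deployed scheme actually does.

```javascript
const BLOCK = 32;                              // SHA-256 output size in bytes
const N = (64 * 1024 * 1024) / BLOCK;          // block count -> ~64 MiB resident

async function sha256(bytes) {
  return new Uint8Array(await crypto.subtle.digest('SHA-256', bytes));
}

async function memoryHardSolve(seed) {
  const pad = new Uint8Array(N * BLOCK);       // the memory that must stay hot
  let block = await sha256(new TextEncoder().encode(seed));
  for (let i = 0; i < N; i++) {                // sequential fill: pad[i] = H(pad[i-1])
    pad.set(block, i * BLOCK);
    block = await sha256(block);
  }
  for (let i = 0; i < N; i++) {                // data-dependent mixing pass
    const j = ((block[0] << 16) | (block[1] << 8) | block[2]) % N;
    const mixed = new Uint8Array(2 * BLOCK);
    mixed.set(block, 0);
    mixed.set(pad.subarray(j * BLOCK, (j + 1) * BLOCK), BLOCK);
    block = await sha256(mixed);
  }
  return block;                                // final digest serves as the proof
}
```

With these illustrative parameters each in-flight solve pins roughly 64 MiB, so a scraper box with 64 GiB of RAM tops out at around a thousand concurrent solves, whereas a plain SHA-256 preimage challenge has no comparable ceiling; the counterargument above is that the memory is only needed for the few seconds the solve takes, not for the whole session.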