gruez 1 year ago
Off the top of my head:
https://everyuuid.com/
https://github.com/nolenroyalty/every-uuid
  johnisgood 1 year ago
  How is that infinite if the last one is always the same? Am I misunderstanding this? I assumed it is almost like an infinite scroll or something.
    gruez 1 year ago
    Here's another site that does something similar (iterating over bitcoin private keys rather than uuids), but has separate pages and would theoretically catch a crawler:
    https://allprivatekeys.com/all-bitcoin-private-keys-list
      johnisgood 1 year ago
      503 :D
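Neither of the sites mentioned above publishes its implementation, so the following is only a rough sketch of the trap gruez describes: every /page/<n> is generated on demand and links to /page/<n+1>, so a link-following crawler never reaches a last page. The handler name, port, and the SHA-256 filler rows are all made up for the illustration; this is not how allprivatekeys.com is actually built.

    # Sketch of an endless-pagination crawler trap (illustrative only).
    import hashlib
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class TrapHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Expect paths like /page/12345; anything else gets page 0.
            try:
                n = int(self.path.rstrip("/").rsplit("/", 1)[-1])
            except ValueError:
                n = 0
            # Derive deterministic filler "keys" for this page so nothing is stored.
            rows = "\n".join(
                f"<li>{hashlib.sha256(f'{n}:{i}'.encode()).hexdigest()}</li>"
                for i in range(10)
            )
            body = (
                f"<html><body><h1>Page {n}</h1><ul>{rows}</ul>"
                f'<a href="/page/{n + 1}">next page</a></body></html>'
            ).encode()
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        HTTPServer(("127.0.0.1", 8000), TrapHandler).serve_forever()

Because each page is derived from its number on the fly, the server holds no state and the "next page" link never runs out.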
  diggan 1 year ago
  Aren't those finite lists? How is a scraper (normal or LLM) supposed to "get stuck" on those?
    gruez 1 year ago
    Even though 2^128 UUIDs is technically "finite", for all intents and purposes it is infinite to a scraper.
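To put a number on "for all intents and purposes infinite": assuming a generous (and entirely made-up) crawl rate of one million pages per second, exhausting 2^128 pages would take on the order of 10^25 years.

    # Back-of-the-envelope check: the universe is roughly 1.4e10 years old.
    pages = 2 ** 128
    rate = 1_000_000                      # pages per second (assumed, generous)
    seconds_per_year = 60 * 60 * 24 * 365
    years = pages / (rate * seconds_per_year)
    print(f"{years:.3e} years")           # ~1.079e+25 years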