gruez 10 months ago
Off the top of my head: https://everyuuid.com/ (https://github.com/nolenroyalty/every-uuid)
johnisgood 10 months ago
How is that infinite if the last one is always the same? Am I misunderstanding this? I assumed it was almost like an infinite scroll or something.
gruez 10 months ago
Here's another site that does something similar (iterating over bitcoin private keys rather than UUIDs), but it has separate pages and would theoretically catch a crawler:
https://allprivatekeys.com/all-bitcoin-private-keys-list
johnisgood 10 months ago
503 :D
diggan 10 months ago
Aren't those finite lists? How is a scraper (normal or LLM) supposed to "get stuck" on those?
gruez 10 months ago
Even though 2^128 UUIDs is technically "finite", for all intents and purposes it is infinite to a scraper.
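A back-of-the-envelope sketch of that point. The crawl rate below is an assumed figure for illustration, not a measured one:

```python
# How long would a crawler need to fetch a page for every one of the
# 2**128 possible UUIDs? Assume a (very generous) crawl rate of one
# million pages per second.
total_uuids = 2**128                      # ~3.4e38
rate_per_second = 10**6                   # assumed crawler throughput
seconds = total_uuids // rate_per_second
years = seconds // (365 * 24 * 3600)
print(f"{years:.2e} years")               # on the order of 1e25 years
```

For comparison, the age of the universe is roughly 1.4e10 years, so even an implausibly fast crawler never finishes the list.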
fc417fc802 10 months ago
[dead]