Comment by gruez, 3 months ago:
Off the top of my head: https://everyuuid.com/ (https://github.com/nolenroyalty/every-uuid)

6 comments
johnisgood, 3 months ago:
How is that infinite if the last one is always the same? Am I misunderstanding this? I assumed it is almost like an infinite scroll or something.
gruez, 3 months ago:
Here's another site that does something similar (iterating over bitcoin private keys rather than UUIDs), but has separate pages and would theoretically catch a crawler: https://allprivatekeys.com/all-bitcoin-private-keys-list
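A minimal sketch of how such a site can serve "separate pages" without storing anything: each page's contents are derived from the page number, and every page links to the next, so a crawler that follows pagination links never runs out. This is hypothetical illustration, not the actual site's implementation; the key range assumes 256-bit bitcoin private keys.

```python
# Hypothetical sketch of a "list every private key" tarpit: page N is
# computed from N on the fly, so no page is ever the last one in practice.
KEYS_PER_PAGE = 128
MAX_KEY = 2 ** 256  # bitcoin private keys are 256-bit integers

def page_keys(page: int) -> list[str]:
    """Return the hex-encoded keys shown on a given 1-indexed page."""
    start = (page - 1) * KEYS_PER_PAGE + 1
    end = min(start + KEYS_PER_PAGE, MAX_KEY)
    return [format(k, "064x") for k in range(start, end)]

def next_page_link(page: int) -> str:
    # Every page links to the next, which is what traps a naive crawler.
    return f"/all-bitcoin-private-keys-list?page={page + 1}"
```

Because nothing is precomputed, serving page 10^30 is as cheap as serving page 1.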
johnisgood, 3 months ago:
503 :D
diggan, 3 months ago:
Aren't those finite lists? How is a scraper (normal or LLM) supposed to "get stuck" on those?
gruez, 3 months ago:
Even though 2^128 UUIDs is technically "finite", for all intents and purposes it is infinite to a scraper.
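A back-of-the-envelope calculation (my numbers, not from the thread) shows why "finite" doesn't help the scraper: even at a wildly optimistic billion pages per second, enumerating 2^128 UUIDs takes on the order of 10^22 years.

```python
# How long would a scraper need to visit every possible UUID?
TOTAL_UUIDS = 2 ** 128            # all 128-bit values
PAGES_PER_SECOND = 1_000_000_000  # an extremely generous crawl rate
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

years = TOTAL_UUIDS / PAGES_PER_SECOND / SECONDS_PER_YEAR
print(f"{years:.2e} years")  # on the order of 1e22 years
```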
fc417fc802, 3 months ago:
[dead]