Comment by mixologic

3 days ago

Because it's static content that is almost never cached, since it's infrequently accessed. Thus, almost every hit goes to the origin.
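
A minimal sketch of that effect, assuming a uniform long-tail access pattern over many distinct pages in front of a small LRU edge cache (all numbers here are made up for illustration, not from the comment):

```python
from collections import OrderedDict
import random

# Hypothetical setup: a small edge cache in front of a very large
# set of rarely-revisited pages.
CACHE_SLOTS = 1_000
DISTINCT_PAGES = 1_000_000

cache = OrderedDict()
hits = misses = 0

for _ in range(100_000):
    page = random.randrange(DISTINCT_PAGES)  # uniform long-tail traffic
    if page in cache:
        hits += 1
        cache.move_to_end(page)  # mark as recently used
    else:
        misses += 1
        cache[page] = True
        if len(cache) > CACHE_SLOTS:
            cache.popitem(last=False)  # evict least-recently-used entry

# Expected hit rate ~ CACHE_SLOTS / DISTINCT_PAGES = 0.1%,
# i.e. nearly every request reaches the origin.
print(f"hit rate: {hits / (hits + misses):.2%}")
```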

The content in question is statically generated 1-3 KB HTML files. Hosting a single image would cost the bandwidth equivalent of cold-serving hundreds of these requests.
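
As a rough back-of-the-envelope check (the 1-3 KB HTML figure is from the comment; the image size below is an assumed typical value, not a figure from the comment):

```python
# Hypothetical sizes for illustration only.
html_page_kb = 2        # comment cites 1-3 KB statically generated HTML files
typical_image_kb = 250  # assumed average image size

print(f"one image ~= {typical_image_kb / html_page_kb:.0f} cold HTML requests")  # ~125
```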

Putting up a scraper shield seems like more of a political statement than a solution to a real technical problem. It's also antithetical to open collaboration and to the open internet, of which Linux is a product.