
Comment by miki123211

9 hours ago

Wikipedia is also uniquely cacheable.

I suspect that 95+% of visits to Wikipedia don't actually require the servers to run any PHP code at all; they're just served from a cache, since every logged-out user viewing a given article sees basically the same thing.
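A minimal sketch of that cacheability argument, in Python. The names here are hypothetical (Wikipedia's real stack puts Varnish/ATS caches in front of the PHP application), but the shape is the same: the cache key for anonymous visitors ignores the user entirely, so every logged-out reader of an article shares one cached render.

```python
# Hypothetical sketch: why anonymous page views are so cacheable.
cache: dict[str, str] = {}

def render_article(title: str) -> str:
    # Stands in for the expensive application-code render path.
    return f"<html>full render of {title}</html>"

def serve(title: str, logged_in: bool) -> str:
    if logged_in:
        # Personalized chrome (username, watchlist state) forces a real render.
        return render_article(title)
    if title not in cache:
        # Only the first anonymous visitor pays the render cost.
        cache[title] = render_article(title)
    return cache[title]
```

Every subsequent anonymous request for the same title is a dictionary lookup, not a render, which is the sense in which the PHP code "never runs" for most visits.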

This is in contrast to, e.g., a social network, which needs to compute timelines per user. Even with no machine learning, where the algorithm is just "most recent posts first", there's still plenty of computation involved. Mastodon is a good example here.
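To make the "plenty of computation" concrete, here is a hedged sketch of the naive read path: for each viewer you must merge the post streams of everyone they follow, so no two users' pages are the same and nothing caches globally. The data model is invented for illustration (Mastodon in practice fans posts out to follower feeds at write time rather than merging on every read).

```python
# Hypothetical sketch: even a purely chronological timeline is per-user work.
import heapq

# author -> posts as (timestamp, text), newest first
posts_by_author = {
    "alice": [(5, "a2"), (1, "a1")],
    "bob":   [(4, "b2"), (2, "b1")],
    "carol": [(3, "c1")],
}

def home_timeline(followees: list[str], limit: int = 10) -> list[str]:
    # k-way merge of the followees' streams, newest first.
    # Cost grows with follow count and post volume, and the result
    # differs for every viewer, so it can't be served from a shared cache.
    streams = (posts_by_author.get(a, []) for a in followees)
    merged = heapq.merge(*streams, key=lambda p: p[0], reverse=True)
    return [text for _, text in list(merged)[:limit]]
```

Real systems trade this read-time merge for write-time fan-out (appending each new post to every follower's precomputed feed), which is exactly the kind of engineering a static encyclopedia never needs.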

The move away from "most recent posts first" is partly because a strictly chronological timeline is actually harder to serve at scale than an algorithmic one.