Comment by mapt

I don't see that accommodating this behavior is prosocial or technically desirable.

Can you think of a use case?

All sorts of bots want this sort of access, but whether there are legitimate reasons to grant it on a non-sharded basis is another question, since many of these queries don't scale their resource cost linearly (O(n)) even on a centralized server architecture.

Given enough time, you'll end up with a lot of legitimate users who follow a huge number of accounts but rarely interact with more than a handful. It's similar to how many long-time YouTubers have a very high subscriber-to-viewer ratio (far more subscribers than their average view count would suggest), and there's nothing inherently suspicious about it. People lose access to their accounts, make new accounts, die, get bored, or otherwise stop watching the content, but never bother unsubscribing because the algorithm recognized this and stopped recommending the channel's uploads to them.

Bluesky doesn't have this problem yet because it's so young, so the outsized follow counts are mostly going to come from doomscrollers and outright malicious users. But even if it were exclusively malicious users, there is no perfect algorithm to identify them, much less to do so before they start causing performance problems. Under those constraints, it makes sense to limit the potential blast radius and keep the site more usable for everyone.

From TFA:

> Generally, this can be dealt with via policy and moderation to prevent abusive users from causing outsized load on systems, but these processes take time and can be imperfect.

So it’s a case of the engineers accepting that, however hard they try to moderate, these sorts of cases will crop up and they may as well design their infrastructure to handle them.
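To make the "limit the blast radius" idea concrete: one way to bound fan-out load is to deliver posts to a heavy follower's timeline only probabilistically, so that a user following far more accounts than a threshold imposes a capped cost instead of a linear one. This is a minimal sketch of that idea; the threshold value and the decay formula are my own assumptions, not Bluesky's actual implementation.

```python
import random

# Hypothetical threshold; the real value would be a tuning choice.
MAX_FOLLOWS = 2000


def delivery_probability(follow_count: int, max_follows: int = MAX_FOLLOWS) -> float:
    """Probability that a post is written into this follower's timeline.

    Followers at or under the threshold always receive the write; beyond it,
    the probability decays so that total fan-out cost per follower is bounded.
    (This particular decay curve is an assumption for illustration.)
    """
    if follow_count <= max_follows:
        return 1.0
    return max_follows / follow_count


def fan_out(post_id: str, followers: list[dict], rng=random.random) -> list[str]:
    """Return the ids of followers whose timelines receive the post."""
    delivered = []
    for follower in followers:
        if rng() < delivery_probability(follower["follow_count"]):
            delivered.append(follower["id"])
    return delivered
```

The upshot is that a timeline belonging to someone following 100,000 accounts quietly becomes lossy, while ordinary users see no difference, which matches the trade-off the engineers describe: accept imperfection up front rather than depend on moderation catching every outlier.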