Comment by nullc

3 days ago

but what if the alternatives are fundamentally worse? Turns out centralization has a lot of advantages.

I think it's an error to demand that the alternatives be as good-- that might not always be possible. But even if they're less good, they're usually still better than anything we could have imagined decades ago-- good enough to use.

And that should be enough because we shouldn't consider handing control of ourselves to third parties to be an acceptable choice at all.

Let's dig into what makes them worse, and see what we can do about it.

I think the main struggle is moderation. Moderation requires a hierarchy, which is much more compatible with a centralized model. Curation could be a good alternative: rather than authoritatively silencing unwanted content, categorize it well enough that users can filter for what they want.
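As a rough illustration of curation-over-moderation (entirely hypothetical: it assumes posts carry community-applied category tags, and names like `Post` and `visible` are made up for the sketch), a user-side filter might look like:

```python
# Hypothetical sketch: user-side filtering of categorized content.
# Nothing is removed at the source; each reader drops the categories
# they have opted out of.

from dataclasses import dataclass, field

@dataclass
class Post:
    author: str
    text: str
    tags: set[str] = field(default_factory=set)  # community-applied categories

def visible(posts, blocked_tags):
    """Return posts whose tags don't intersect the reader's blocked set."""
    return [p for p in posts if not (p.tags & blocked_tags)]

feed = [
    Post("alice", "kernel patch review", {"tech"}),
    Post("bob", "rage bait", {"politics", "flamewar"}),
]
print([p.author for p in visible(feed, {"flamewar"})])  # ['alice']
```

The point of the sketch is that the filter runs on the reader's side: the same feed can be sliced differently by every user, with no central authority deciding what exists.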

  • I agree with you, but many people have yet to understand that content they disagree with will continue to exist no matter what, and that central gatekeepers can't actually eliminate it.

    The fucking “nazi bar” analogy has ruined an entire generation. You would think after centuries of trying to stamp out competing ideas, humans would finally come to terms with the fact that it cannot be done.

    Small curated groups are the only way to enforce ideological orthodoxy. You cannot force it on the public, nor can you punish the public for holding bad ideas without creating blowback and resistance.

    • I don't think we have to argue against the "nazi bar" analogy, though. In that analogy, nazis are allowed to exist in the world, just not in the bar. The difference is how we implement the concept of "in". The same analogy works if you are out on the street: everyone is allowed to be there, but that doesn't give nazis the right to your attention.

      Until we have a real way to meaningfully process natural language (I have a serious idea for that, but that's another conversation), we won't be able to automate content filtration. The next best thing is ironically similar to what we came here to complain about: attestations in a web of trust.

      If everything we bother to read is tied to a user identity (which can be anonymous), we can filter out content from any user identity that is generally agreed to be unwelcome. The traditional work of moderation can be replaced by collaborative categorization of both content and publishers. Any identity whose published content is too burdensome to categorize can simply be filtered out completely.

      The core difference is that there are no "special" users: anyone can make, edit, and publish a filter list. Authority itself is replaced by every participant's choice of filter. Moderated spaces are replaced by the most popular intersection of lists. Identity is verified by the attestation of other identities, based on their experience participating with you.
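The filter-list idea above can be sketched in a few lines (all names are hypothetical; this assumes identities are opaque strings, each published list is just a set of identities its author considers unwelcome, and "intersection of lists" means an identity is hidden only when enough of a reader's subscribed lists agree):

```python
# Hypothetical sketch: reader-chosen filter lists instead of central moderators.
# Anyone can publish a list of unwelcome identities; each reader picks which
# lists to subscribe to and how many of them must agree before hiding someone.

def blocked(filter_lists, quorum):
    """Identities named by at least `quorum` of the subscribed lists."""
    votes = {}
    for lst in filter_lists:
        for identity in lst:
            votes[identity] = votes.get(identity, 0) + 1
    return {i for i, n in votes.items() if n >= quorum}

alice_list = {"spammer1", "troll9"}
bob_list = {"troll9", "flooder3"}

# Subscribe to both lists and require both to agree (their intersection):
print(blocked([alice_list, bob_list], quorum=2))  # {'troll9'}
```

Setting `quorum=1` instead takes the union of the lists, so the same mechanism spans strict and aggressive filtering; the knob belongs to the reader, not to a moderator.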
