Comment by neilv

5 days ago

I don't know about this particular case, but, generally... bad actor subreddit moderators have been an occasional thing for well over a decade.

And it's also been widely known for that long that Reddit is an influential venue in which to take over a corner -- for marketing or propaganda.

What's an equal concern to me is how insufficiently resilient Reddit collectively appears to be in the face of this.

A bad actor mod of a popular subreddit can persist for years, visibly, without people managing either to oust the mod, or to take down the sub's influence.

(Subreddit peasants sometimes migrate to a new sub over bad mods, but the old sub usually remains, still with a healthy brand and still with a lot of members -- who (speculating) maybe don't want to risk missing out on something in the bad old sub, or didn't know what was going on, or decided the drama they noticed in their feed wasn't worth the clicks to leave the sub in question.)

Reddit has a moderation problem, and it's a big one.

They've now been asked to appear in front of Congress to address concerns about politically motivated violence being incited through their platform: https://oversight.house.gov/release/chairman-comer-invites-c...

Personally I believe I've seen more people in the past few years wish a politically motivated death on somebody else via Reddit, than I have anywhere else in my life.

Now if it was "just" the incitements to violence, or if it was "just" the libeling of random businesses, that would be one thing. But the fact that BOTH types of illegal speech are becoming a problem at the same time suggests to me that Reddit's failure to moderate is systemic and total.

It is becoming exhausting watching all of these tech companies commit crimes, or enable someone else to do so, and get off with a slap on the wrist.

  • Moderation on Reddit has been questionable for a long time, and it's killing the site. To give some examples:

    - /r/energy used to ban everyone in favour of nuclear energy

    - If you post on /r/conservative you can expect to receive a bunch of bans from unrelated (popular) subs. It doesn't matter what you posted; being associated with that sub "taints" your account enough for some moderators.

    - /r/UnitedKingdom banned me for criticizing a government welfare program

    - /r/assassinscreed banned me for criticizing a character in their latest game

    For me it makes sense that the smaller subreddits should have the freedom to moderate as they want, but the larger ones should aim to allow opposing viewpoints, to prevent echo chambers from forming. Moderation should be focused on quality, not on viewpoints. Obviously it goes without saying that threats of violence and celebration of murder have no place on any platform.

    The irony is that all this censoring just creates a backlash and further polarisation. If you are only allowed to discuss certain subjects in a "left" space, you create the illusion that the left only cares about a subset of topics, and by banning people you create resentment that drives them towards (more welcoming) extreme spaces.

    There are many factors that form the political preferences and opinions of the younger generation, but it would not surprise me if, for a subset of them (young college-educated males?), Reddit heavily contributes towards increased polarisation.

    • > - If you post on /r/conservative you can expect to receive a bunch of bans from unrelated (popular) subs. It doesn't matter what you posted; being associated with that sub "taints" your account enough for some moderators.

      You left out the fact that you can’t post to /r/conservative until the moderators there audit your post history and perform an interview with you to confirm your ideology matches theirs.

      If someone does pass the test they’re allowed to comment. If they make a comment that disagrees with the message the moderators want to push, their commenting privilege is revoked.

      It’s not a real subreddit. It’s a moderator-curated echo chamber. They run it like a propaganda outlet, only allow approved thought from approved commenters, and ban anyone who steps out of line with the mods.

      That’s why every thread you view there will have “load more comments” buttons that never load anything: They remove more comments than you’re allowed to see.

      72 replies →

    • /r/AskBrits banned me for pointing out that there are several threads each day about immigration, each tailor made rage bait. Sometimes they’re not even a question.

      I’ve personally caught a couple of Iranians and Russians brazenly posting such threads at 4am British time (working hours in Tehran) and the moderators did nothing. They simply allow such threads while deleting any thread that goes “is anyone sick of the constant threads about immigration?”

      These threads generate so much engagement from people of all opinions that it makes the sub appear in people’s feeds as recommended content even if they’re not subscribed to the sub. It gives people the impression that there is only one political subject in the UK that gets any discussion.

      I don’t know why the moderators of this sub do this, but the effects of their moderation are clear.

      30 replies →

    • It sounds like a large part of the problem is how important a subreddit name is to popularity. If a subreddit has a good obvious name it is going to collect members and activity even if the mods are awful. Competing subreddits will struggle to attract new users as they need some different less-obvious name.

      I wonder if this could be approached in a way that new subreddits didn't have this disadvantage so that they could compete on mod quality and slowly grow / migrate the community.

      Of course there are advantages to short unique names like readable links. But it seems that this false authority may not be worth the downsides.

      3 replies →

    • I would share my own stories of bans, but they're so ridiculous (including all four of the "strikes" that led to my account ban by the admins) that I wouldn't expect anyone to believe it without evidence, and it all happened many years ago (but I fully expect things are even worse now).

      Although I do notice that r/science is apparently down to "only" about 1300 moderators. I'm pretty sure they broke 2000 at some point. (The large majority of those have been around for at least 5 years; it seems that the Reddit UI caps the displayed age, because I recognize names from much more than 5 years ago.)

      1 reply →

    • IMO, Reddit's main problem (and this certainly isn't unique to Reddit) is that it is a registry of names.

      There can be only one subreddit named r/politics, so whoever gets that name essentially decides how you can talk about politics on Reddit. Same applies to any other subject.

      R/fishing will always sound more credible than r/fishing2 or r/2wqy4f. If there's some kind of fishing controversy, and the mods of r/fishing only allow one side to speak, that side gains a lot more credibility. The other side can move somewhere else, but that place won't have the credibility associated with r/fishing.

      Reddit can try to fight this, but as long as subreddits have unique and memorable names instead of IDs, this is going to be a problem and require them to get their hands dirty.

    • You missed maybe the biggest one, /r/bitcoin, which around 2015 started banning anyone who wanted Bitcoin to actually follow the original design and continue scaling up on-chain transactions. The moderator, some anonymous student (possibly named Michael Marquardt), literally declared anyone who wanted Bitcoin to be used for regular transactions offtopic and banned them on a massive scale.

      When explaining his actions he said something like, "I've moderated forums before so I know how sustained censorship can change a community". And then he set out to do it.

      Reddit has been garbage for a long time and people's reliance on it is a huge problem. Abuse of it redirected Bitcoin onto a fundamentally different path (one nobody had agreed to), simply because of the sustained gaslighting and psychological manipulation its format allows.

      That said, user-driven content moderation sucks everywhere. Wikipedia has the same problem. So does HN to some extent. The future is moderation driven entirely by LLMs with openly published prompts.

      9 replies →

    • I can throw in another example: /r/lectures was a really cool place where people shared mostly academic lectures. A mod took over, put the sub into approved-posts-only mode, and now just grants token approvals very rarely, with no way to reclaim the sub.

    • /r/conservative is probably the most heavily censored echo chamber on Reddit, yet somehow you only take issue with other subreddits flagging participation.

      36 replies →

    • /interestingasfuck banned me for commenting on /asmongold at some point. Not even for the content. Simply for having interacted with /asmongold.

      Edit: To be clear I wasn't picked on by anyone. It's a bot they run. This is a blanket ban that /interestingasfuck extends to anyone who has commented/posted on /asmongold, or any subreddit they consider to be right wing (by USA standards).

    • I was banned from /r/askaconservative for stating a very mainstream position. Mods told me I was "astroturfing".

    • It goes both ways. If you try to post anything remotely criticizing Donald Trump or his government on /r/conservative you'll also get banned. Even if you try to keep it objective.

      32 replies →

    • I got permanently banned from Reddit for participating in a thread debating the death penalty. In which I wrote one comment suggesting we shouldn’t waste a bunch of court costs on mass shooters who are blatantly guilty.

      That was considered “instigating violence” lol

      7 replies →

  • I can’t help but notice that Twitter and TikTok didn’t get called for that session. In November 2023, Twitter went from a zero tolerance policy for violent speech to “we may remove or reduce the visibility of violent speech.” Seems really relevant for the topic of the hearings! And yet.

    I’m thus unwilling to take Rep. Comer’s decision to call Reddit to testify as evidence of anything. Feels more like political theater to me. This doesn’t condemn or absolve Reddit either way; it’s just not strong evidence.

    • While Twitter has many problems, it does seem to do a reasonable job of not promoting hate and violence towards a large audience. There are many messages critical of immigrants on my timeline, but none calling for violence against them (or against any other group that Twitter users dislike).

      Meanwhile, posts about violence against Trump or Musk, or celebrating the death of Kirk, did get massive upvotes and visibility on some of the biggest and most popular subs on Reddit.

      16 replies →

  • “The politically motivated assassination of Charlie Kirk claimed the life of a husband, father, and American patriot. In the wake of this tragedy, and amid other acts of politically motivated violence, Congress has a duty to oversee the online platforms that radicals have used to advance political violence. To prevent future radicalization and violence, the CEOs of Discord, Steam, Twitch, and Reddit must appear before the Oversight Committee and explain what actions they will take to ensure their platforms are not exploited for nefarious purposes,” said Chairman Comer.

    ---------------

    Reddit absolutely does have a moderator problem, as one would expect for a platform that relies on anonymous volunteers, but this might merely be the pretext for a witch hunt. e.g. The Trump administration may actually attempt to track down users who posted anti-Kirk or anti-Trump memes. It might be something even more though. There may be an attempt to coerce these platforms to start moderating in a way that's more favourable to Trump. Reddit is a hotbed of anti-Trump memes after all.

    Protest is the bane of authoritarian regimes. That's why the Trump administration moved to lock down colleges so rapidly early this year. However, online social media also has significant capacity for influencing public opinion. This is why so many authoritarian regimes simply cut off internet access for their people. Others (e.g. China) have attempted to censor, manipulate, and control the internet rather than cutting it off.

    Americans, and the world, should be paying close attention to these hearings. They should also pay attention to any sudden changes in behaviour of these companies. Merely being summoned to a hearing might be enough of a threat to make them give Trump all he asks for.

    • > There may be an attempt to coerce these platforms to start moderating in a way that's more favourable to Trump.

      That’s the real comedy about this; when we like it censorship is good, when we don’t like it (Covid shutdown, anti vax, Jan 6th) censorship is bad. The double standard is shocking, yet completely normalized.

      Besides, any attempt to end violent rhetoric has to start with POTUS himself; theater is exactly what this is.

      2 replies →

  • There's an article on the Reddit blog, still up on archive.org, showing that a huge percentage of the website's traffic comes from... Eglin AFB in the United States. That base also happens to be home to at least three distinct units that engage in "cyber" stuff.

  • If those are illegal, where are the prosecutions?

    In my understanding, libel is a civil tort, and the victim can sue if they think they have been libeled. And wishing someone dead isn't illegal in the US, though it may be elsewhere.

  • An acquaintance who used to be active on reddit watched an angry mob "dox" his long-time pseudonym (they found a real person by the same name) with instructions to harass his employer and calls for IRL assault. Shortly afterward, his account was permabanned and he was unable to create a new one from the same IP.

    This wasn't just a reddit problem, Twitter had plenty of the same cancel campaigns.

  • How can we know that this or that example of speech is illegal if there are no charges and no trial? This rule by corporate fiat is exactly what we don’t need. It lacks democratic oversight. To say nothing of the way that disingenuous claims of “political violence” are being used to suppress legitimate dissent in our country.

  • > Personally I believe I've seen more people in the past few years wish a politically motivated death on somebody else via Reddit, than I have anywhere else in my life.

    What you'll also see is a lot of accounts banned just for saying that they can't wait for say Vladimir Putin to die. I'm sure there are ways in which you could construe that to be 'politically motivated death' but that's just a weak excuse to ban an account ignoring the deeper subtext. Wanting mass murderers to shuffle off their mortal coil is a net positive for the world.

  • >They've now been asked to appear in front of Congress to address concerns about politically motivated violence being incited through their platform

    Funny how, for the last 30 years in which right-wing violence and extremism far exceeded left-wing, nothing was done about it at all, no questions asked. Hush hush, don't talk about gun control or the real causes of these people's actions.

    But then the year that left wing violence finally exceeds right wing, they all start crying that it's unacceptable and something needs to be done about it.

    Source: https://www.csis.org/analysis/left-wing-terrorism-and-politi... "So far, 2025 marks the first time in more than 30 years that left-wing terrorist attacks outnumber those from the violent far right."

    • >But then the year that left wing violence finally exceeds right wing

      It hasn't even! Like even if you take the Kirk murder as an explicitly left wing murder, "leftist" violence is still not even a shadow of what it used to be in the US

      We used to shoot at business magnates for fighting unionization! The weather underground was an explicitly Marxist organization! The black panthers were a black supremacist organization!

      Your own source makes the point that the reason left wing attacks "outnumber" right wing ones is that right wing attacks have dramatically decreased

      Because when ICE does it, it's not considered a "right wing attack"!

    • > But then the year that left wing violence finally exceeds right wing

      this framing will most likely confuse most people, because it's essentially hundreds of murders/mass shootings (by the right) vs. two murders (by the left; the rest is probably property damage or whatever).

      It's also going to be confusing because Luigi is not a confirmed leftist and the Kirk shooter is not a confirmed leftist; even putting aside the problematic presumption that they are before we have evidence, counting them still totals only about three left-wing murders since 2020.

      But ICE's current actions would clearly be classified as right-wing violence by those standards, and they are overwhelmingly well documented and numerous. Some people might not like that framing, whether because they're right-wingers or because they're looking for info on non-state-sanctioned terrorism, so it wouldn't be a bad idea to give ICE its own category in the next version of some of those charts.

      1 reply →

  • The reason why Reddit is being "investigated" in this way is clearly and without any doubt political and has nothing to do with Reddit's moderation. There are strong anti-free-speech forces in the USA currently, and Reddit is #1 on their target list.

    Anyone who can't see that is blind in the right eye, which is unfortunately a common phenomenon in certain circles nowadays.

    • This is a common theme in the current political climate.

      "If you can't see" <Insert my strongly held ideology> "then you are blind".

    • Many years ago, I looked at front-page threads on r/socialism and found blatant, undeniable calls to political violence all over the place. It was way worse than anything I'd ever seen on r/TheDonald. My reports to the admins went ignored, as far as I could tell.

      2 replies →

  • What country are you from? To "wish a politically motivated death" on someone is illegal there?

    Reddit set itself up as a speakeasy; people speak their minds openly because it appears, in some areas, to be free of thought policing.

    Do you think it is wrong to wish a dictator dead? Over the past decades USA has not only wished it, but made it happen, at the cost of many lives.

    • Reddit definitely has not set themselves up that way. Many people got banned just for saying they understand and empathize with Luigi's motivations.

      1 reply →

    • > What country are you from? To "wish a politically motivated death" on someone is illegal there?

      It is illegal in most countries, no? Even in the USA you aren't allowed to instigate murder.

      6 replies →

    • > What country are you from? To "wish a politically motivated death" on someone is illegal there?

      This is a strawman. Your quoted text does not come from GP and does not fairly represent any of its argument (which makes your use of italics hard to understand).

      Actual incitement to political violence is actually occurring on these platforms. People have screencaps and everything.

> And it's also been widely known for that long that Reddit is an influential venue in which to take over a corner -- for marketing or propaganda.

Capturing moderation of a subreddit has long been a strategy of marketing agencies.

Even when they can’t take over the actual mod positions, they’ll shower the mods with free product and make them feel like a VIP. I watched this happen from inside one company and I couldn’t believe how easily the marketing team turned a mod into our biggest advocate by sending free products to them from time to time.

> A bad actor mod of a popular subreddit can persist for years, visibly, without people managing either to oust the mod, or to take down the sub's influence.

In some of the subreddits I followed, the remaining users came to feel a relationship with the mods over time and felt they were on the same side. There are subreddits like /r/nootropics where many users don’t realize the mod team has been captured by a supplement company (Nootropics Depot) and that they have a history of deleting posts critical of Nootropics Depot. You would think this would be grounds for a subreddit riot, yet whenever I check, it feels like everyone there is a fan of Nootropics Depot, and therefore they get a pass. Note that the quality of the science discussed on /r/nootropics has been generally terrible in recent years, which is certainly a related factor. It’s also not hard to find comments in other subreddits from people who were banned from /r/nootropics.

I think this happens across a lot of subreddits. Moderators find reasons to ban the dissenters and shape the conversation until the hive mind consensus favors the mods, so any issues aren’t discussed. People who object are banned for different reasons and minor infractions, then get tired of Reddit and move on. What remains is captured by companies pushing their products to an audience who thinks the mods are doing them a favor.

  • I wonder if it would work for a free-speech site to allow mods not to include a story in a category/subreddit, but then just place that story into, say, /r/changemyview/banned. You'd still need sitewide moderation, but you'd always be able to see how your feed was being edited within that context.

  • This seems to be happening on city-based subs as well, where the split is political, creating echo chambers for each side. This feels dangerous, as any potential middle ground gets eroded away.

    • It's gone multiple ways in the past for not just city subreddits, but all kinds of regional ones. For example, r/canada has r/OnGuardForThee (because they thought the mods were allowing bigotry) and also (now private) r/RedEnsign (because, more or less, they thought the crowd making r/OnGuardForThee was falsely defaming them as bigots).
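The "visible removal bucket" idea a couple of comments up can be sketched as a tiny data model: a mod action never deletes a post, it only relabels it into a banned bucket, so anyone can audit how a community's feed was edited. This is purely a hypothetical sketch; every class, field, and method name here is illustrative, not any real Reddit API.

```python
# Sketch: moderation that relabels instead of deleting, so removals stay auditable.
from dataclasses import dataclass, field

@dataclass
class Post:
    id: str
    title: str
    category: str  # e.g. "changemyview" or "changemyview/banned"

@dataclass
class Community:
    name: str
    posts: dict = field(default_factory=dict)

    def submit(self, post):
        self.posts[post.id] = post

    def mod_remove(self, post_id):
        # Instead of deleting, move the post into a visible "banned" bucket.
        self.posts[post_id].category = f"{self.name}/banned"

    def feed(self, include_banned=False):
        # The default feed hides removed posts, but any reader can opt in
        # to see exactly what the mods took out.
        return [p for p in self.posts.values()
                if include_banned or not p.category.endswith("/banned")]

sub = Community("changemyview")
sub.submit(Post("t1", "A controversial take", "changemyview"))
sub.submit(Post("t2", "An ordinary post", "changemyview"))
sub.mod_remove("t1")

print(len(sub.feed()))                     # 1: default view hides the removed post
print(len(sub.feed(include_banned=True)))  # 2: the removal remains inspectable
```

The point of the design is that moderation becomes an edit you can diff, rather than an invisible deletion; sitewide moderation (spam, illegal content) would still need to truly delete.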

Think about a Reddit mod's incentives.

They:

- Don't get paid

- Spend time having to do some really thankless work

- Don't really have a regular work schedule

So what kind of person is going to do it?

Someone who is willing to do the work for no pay. For smaller subreddits and areas where the work of moderation isn't that heavy, you'll find passionate individuals.

Mods that moderate more time consuming content or the power mods modding many subs are chasing some other incentive. For some that means explicitly monetizing their time by pushing products and companies who pay them. For others it's the ideological satisfaction of pushing viewpoints they want pushed and suppressing viewpoints they want suppressed. For some it's prestige. For most it's probably some mix of all three.

What's absent is any incentive to surface organic, human content. That's merely a side effect of what mods do, not their main job.

  • There should be a public service campaign telling users something like "Even in the best case scenario, the moderators are weirdos. Most likely they're shills".

    People with careers, families, friends and hobbies are mostly not going to spend their limited free time being a digital janitor for an anonymous online community.

    People sitting alone in their apartment with nowhere to go and nothing to do and no one to spend time with, however, might find that being a Reddit moderator gives them a hobby, a sense of purpose, and feelings of power, importance or significance that they otherwise never get in real life.

    Someone should make a social media site with inverted dynamics- users who only spend a few minutes per day on the site and post once every few weeks should be treated as the influential power users, while the people lurking and scrolling for 10 hours per day are deprioritized.

    • > Someone should make a social media site with inverted dynamics- users who only spend a few minutes per day on the site and post once every few weeks should be treated as the influential power users, while the people lurking and scrolling for 10 hours per day are deprioritized.

      The problem is most users are the "casuals", by a wide margin, in general; and a lot of them are also "weirdos" in different ways. Some of them will be obsessed with a different site; others have serious issues in spite of all the forms of social proof you describe.

    • I think it's a bit tougher than that. On top of what zahlman said, a lot of "casuals" don't really bring much value to a social media site. If you comment once a year you're not really offering much to the conversation. That's what makes this problem so tricky. The most motivated users are usually motivated by something other than intrinsic interest, while the least motivated users just aren't very good users of the platform. A better incentive structure would help motivate the "moderately motivated" user.

  • But what if they do get paid, by a competitor? It's very easy to DM a mod and tell them they will get x amount if they skew the odds in your favor or blast your biggest competitor.

    • What makes you think this doesn't happen? I can almost guarantee it does. If I were willing to pay a Reddit mod off and I saw unfavorable coverage for my brand I'd absolutely try to win the mod over by paying them more than the competitor is paying.

      1 reply →

> A bad actor mod of a popular subreddit can persist for years, visibly, without people managing either to oust the mod, or to take down the sub's influence.

This happens because the regular users have no power. I remember seeing some article that said a small number of mods control most of the popular subreddits. Many of them put their own bias into the system by banning users, banning sources, deleting content based on ideology, shadow banning, etc.

The other issue is that as these mods linger, they drive away or ban everyone who might disagree with them. So the “community” ends up not actually disagreeing with the authoritarian mod. Reddit ends up not being resilient because it doesn’t want to be. Everyone else is gone.

  • When the mods of major subs are also mods of over a hundred other subs, you have to doubt how much actual moderating they are doing from their holier-than-thou positions.

  • Ghislaine Maxwell may have been one of these powerful mods, though this remains a contested conspiracy theory.

    Evidence pasted:

    The Name “Maxwellhill”

    The username directly references “Maxwell,” which is not a common surname. Ghislaine Maxwell grew up at Headington Hill Hall, which was nicknamed “Maxwell Hill” after her father, Robert Maxwell, bought it. This isn’t a vague reference; it’s oddly specific and personal. It’s like someone using “EpsteinIsland” as a username and claiming it’s just coincidence.

    Posting Activity Stopped the Day of Her Arrest (actually 2 days before, when she began wrapping her phone in aluminum)

    u/maxwellhill posted almost every day for 14 years and was one of Reddit’s most active users. Then, with no warning, all posting stopped after June 30, 2020. Ghislaine Maxwell was arrested on July 2, 2020. The timing is exact. This wasn’t a slow fade or gradual disinterest. It looks like someone was physically unable to post.

    Gaps in Posting Line Up with Real-Life Events

    There were other suspicious posting gaps during major events in Maxwell’s life. Notably, during her mother’s death in 2013 and during the 2011 Kleiner Perkins party, where she was confirmed to be present by former Reddit CEO Ellen Pao. That party shows Reddit leadership at the time was at least aware of her.

    Moderator of Massive Subreddits

    The account was a lead mod of r/worldnews, r/technology, r/politics, r/science, r/europe, r/upliftingnews, r/celebrities, and more. These are major subs that help shape Reddit’s front page and influence global discourse. Whoever had access to this account had immense control. Even after years of inactivity, Reddit auto-added the account back as a moderator in 2024. That suggests the system still treats it like an active, important account.

    The Content

    Maxwellhill posted repeatedly about age of consent laws, often citing obscure countries. They also posted articles defending the legality of child exploitation material and criticized what they called “overzealous” child protection laws. These aren’t normal discussion points for the average Redditor. It reads like someone obsessed with legal gray areas surrounding child abuse.

    Auto Deletion and Censorship

    Mentions of “u/maxwellhill” have been automatically removed from comments in multiple subs. The Daily Dot reported on suspicious deletion behavior tied to the account. Posts about this user “vanished mysteriously,” raising real concerns about censorship. Who or what is protecting the account?

    No Denial from the Account

    If u/maxwellhill is just some random power user, where are they? Why haven’t they logged in to say anything? No posts, no comments, no denials. Nothing for five years. After 14 years of near daily activity, complete silence in the face of serious allegations is suspicious on its own.

    The poster also uses many British expressions in their writing, and listed British foods as their favorite foods in one post.

    Mods of r/WorldNews, which is infamously compromised by paid agents, demanded her posts be deleted from other subreddits.

    The name matches Maxwell’s family estate. The account vanishes the day she’s arrested. It posted about topics deeply aligned with her known behavior. It held mod control over huge parts of Reddit. It still does. And yet it hasn’t said a word in five years. If this isn’t her, it’s someone with eerily similar patterns, priorities, and timing.

    • It's best if you reserve the term "conspiracy theory" for grand conspiracy theories, which require secret coordination on implausible scales.

      The theory here is merely that an influential socialite (what Maxwell was regularly described as before her arrest) was a reddit addict powermod, that some people running reddit were aware of her identity - not necessarily knowing anything about Maxwell's wider social network or the activities she was convicted of.

      Nothing here is especially implausible. It may or may not be correct, but it's not a grand conspiracy theory, just a theory of everyday shady non-public coordination. It's no more a conspiracy theory than it is a conspiracy theory that some people in your town sell drugs (yes, they do, and technically they have to engage in "criminal conspiracy" to do so, but we don't call people conspiracy theorists for believing it happens).

      1 reply →

    • Not surprised that Reddit moderators are pedophiles; that's pretty obvious just from using the site. It's horrible, run by a bunch of sickos; the owner, spez, even had an underage pedo-lite sub for years.

Reddit being Reddit wasn't a problem until it became a source of truth, afforded consensus status and an unwarranted sheen of credence by agentic AI. As the author beautifully (albeit somewhat nihilistically) summarises:

"We have to remember that Reddit isn’t just Reddit anymore. The powers that be have decided that Reddit is infallible, a reliable set of training data for LLMs, and should be featured fucking everywhere."

  • Agreed, Reddit as a source of truth is the issue. Who in their right mind would look at Reddit as a whole and say it is an open, unbiased community focused on true and accurate information? And the article and comments in this very thread show how moderation and its application within Reddit are "contaminated", which is a very good way to describe the situation.

  • That's really stupid. Anyone spending more than an hour reading reddit comments knows that reddit comments are not some bastion of truth.

    • It is true ... in the way that the truth is a needle in a haystack. And that haystack is filled with knives, needles, and other garbage you have to sift through.

Social media should operate under open protocols, including moderation. Moderation should be client-controlled: each user chooses which moderation to apply.

These companies burn through VC money to build systems with network effects then turn around and effectively extract rent. Rent extraction is economically parasitic and anti-productive. This is exactly the sort of thing the government should address by mandating open protocols.
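To make the "client-controlled moderation" idea above concrete, here is a minimal sketch. All names and rules here are hypothetical, not any real protocol: the idea is simply that posts live in an open data layer, while moderation is just a filter each client chooses and applies locally.

```python
# Hypothetical sketch of client-controlled moderation: the shared data
# layer stores all posts; each client applies its own chosen policy.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Post:
    author: str
    text: str

# A "moderation policy" is just a client-side predicate over posts.
ModPolicy = Callable[[Post], bool]

def strict_policy(post: Post) -> bool:
    # Example rule: drop posts from a locally maintained blocklist.
    blocklist = {"spammer42"}
    return post.author not in blocklist

def permissive_policy(post: Post) -> bool:
    # A different client may choose to see everything.
    return True

def render_feed(posts: List[Post], policy: ModPolicy) -> List[str]:
    # Two clients reading the same open data layer can see different
    # feeds, depending on the policy they subscribe to.
    return [p.text for p in posts if policy(p)]

feed = [Post("alice", "hello"), Post("spammer42", "BUY NOW"), Post("bob", "hi")]
print(render_feed(feed, strict_policy))      # ['hello', 'hi']
print(render_feed(feed, permissive_policy))  # ['hello', 'BUY NOW', 'hi']
```

In this framing, today's platforms bundle the data layer and one mandatory policy together; an open protocol would let the policies compete independently of the data.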

  • >Choosing moderation should be client-controlled.

    An idea mostly doomed to failure; the vast majority of people (the ones viewing the ads that pay for the service) don't want to deal with that bullshit.

    Moderation is a hard problem. First you have the flood/spam attacks that, unless instantly dealt with, will bring a service to its knees, as there will be hundreds of bad messages for every good message, creating an enormous bandwidth and filtering cost for each user.

    Then there is the porn problem. Any place that doesn't instantly block porn will be flooded with porn.

    Then there is the flood of off topic bullshit that shows up in any given channel.

    And beyond that there are 20+ other little things that make people feel welcome and want to come to a channel in the first place.

    Simply put, anyone could have created an open-protocol social media platform. No one has, because it's hard and fraught with problems that your users won't want to deal with.

    • That’s an issue of the front-end, not the backend. The backend is where an open protocol is needed to break the parasitism of the social media companies. Whether users deal with spam depends on the moderation policies applied by this or that specific front-end.

      Think of it as a filter. Reddit is a filter on a walled-in social network. What you post there isn’t visible on any other social network and vice versa. But because of that lock-in you are limited to whatever crappy moderation one specific front-end sticks you with, with no alternative if you still want to interact with that social network.

      1 reply →

    • This is honestly why I think a non-free platform is the best way to run in the modern era, especially with the advent of LLMs. The fee can even be as cheap as a dollar, and that will solve so many issues at once. (Note: it can be higher than $1 if needed; SomethingAwful had the infamous "ten bux" for this.)

      - Spam is now too expensive to bother with. Free times infinite is still free; at $1 per account, spam costs thousands to sustain. Not worth it for low-effort content.

      - Rule enforcement is much more tenable because ban evasion has a cost. Is someone really going to pay $1 each time to try and post porn or whatever else? 99.9% won't. That creates a feedback loop where the community should get easier to moderate as it grows, not harder.

      - Needing to pay means you also have a community that at least skews adult. Kids don't have easy access to a credit card for even a $1 payment.

      The main problem is still the same as with free platforms, though: network effects are very strong. Adding more hoops makes adoption harder, and that's arguably the hardest part of launching a new platform.
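The economics behind the fee argument above can be put in back-of-the-envelope numbers. These figures are purely illustrative assumptions, not real data, but they show how even a trivial signup fee changes the cost structure of ban evasion:

```python
# Illustrative arithmetic: cost to a spammer of running throwaway
# accounts when each banned account must be re-purchased for a fee.
signup_fee = 1.00                # hypothetical one-time fee per account
accounts_banned_per_day = 500    # assumed ban rate for a spam operation
days = 30

# On a free platform, burning accounts costs the spammer nothing.
free_platform_cost = 0 * accounts_banned_per_day * days

# With a $1 fee, every banned account must be bought again.
paid_platform_cost = signup_fee * accounts_banned_per_day * days

print(free_platform_cost)   # 0
print(paid_platform_cost)   # 15000.0 -- $15k/month just to keep evading bans
```

The absolute numbers don't matter much; the point is that the cost scales linearly with the ban rate, so aggressive moderation and a small fee compound against the spammer.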

  • Reddit has an easy way to choose moderation: just stop going to the subreddits that are poorly moderated.

Reddit has a serious abusive moderator issue. I suspect they will all be demoted to "VIP community member" soon enough and have that entire layer handled by AI. There's just too much ego involved for a human to do a job like that.

Reddit employees are also moderators that also directly influence public opinion and encourage witch hunts.

It’s a systemic Reddit-the-company issue. Google “Ethan Klein vs Reddit” if you want to go down a recent rabbit hole

  • Klein's case is about copyright (and a somewhat thin claim at that; it sure smells a lot like "I'm attempting to use copyright to quash criticism of me," and if the judge decides that's what's actually going on, he's going to lose his case). Unless I missed an update, he's suing Reddit to try and de-anonymize some people running the subreddit so that they can be properly the target of his copyright lawsuit.

    Worth noting: he does not appear to have filed for defamation, which would be the thing he could complain if what they were saying was materially untrue.

A few years ago a NPR (National Public Radio) reporter called Reddit "...a Frankenstein's monster even they can't control."

There's no way to report a malicious sub as far as I can tell. I've been contacted by scammers that look very legit with the green Mod badge that shows in DMs.

This is how the entire internet functions.

We need to separate the web into data, identity, and moderation.

Users need to become aware that they're not using platforms, they are subscribing to moderator control.

Somebody owns ycombinator.com, can decide what is discussed, and if they ban you, us peasants can't tweak who is moderating or recover our identity and data.

I'm convinced we'll get there eventually, but it starts with recognizing that the only thing special about Reddit is its multi-level-unpaid-moderator-marketing.

>What's an equal concern to me is how insufficiently resilient Reddit collectively appears to be, in face of this.

it's a three-fold issue here.

1. Admins really don't care about moderator behavior. As long as you aren't breaking reddit you'll be ignored. Events like r/wow going private is one of the few times they directly intervene.

2. Moderator ranking is seniority-first. Without admin intervention, you can have a "head moderator" who only acts once a month and still has the final say on anything in that sub.

3. Network effects. As with anything else, the solution of "start your own subreddit" is a doomed task unless the sub is very new. People pool around the sub with the most subscribers, so avoiding the bad mod is difficult.

These are issues I was hoping they'd attempt to address back in the 2010s, but not much has changed. At best, the rule limiting mods to 5 "high-traffic" subs may help in the most extreme cases, but I'm not confident.

Isn't the solution to "fork" the community?

If the moderator is really that bad, the new community takes over (yes, it's more complex than that, but broad strokes).

It's not that different from an open source project with bad maintainers.

  • Yes, but how will you get the word out? The moderator can delete all your promotion of the breakaway subreddit within their subreddit. How are you going to get eyeballs?

    And the truly vindictive moderators will start spamming your new subreddit with e.g. child pornography, and then immediately reporting it to the admins. You had best have your own moderator team running 24/7 to cope with intentional sabotage coming from a person who lives their whole life on Reddit, and will stop at nothing to keep control of the little power that they have. You won't be able to pin this sabotage on the moderator, unless you're in their private Discord channel where they coordinate the attack, which you obviously won't be as you're an outsider. Then they will openly gloat about doing it, because they're on the Right Side Of History, and you are Nazis and deserve everything you get.

    Reddit also has default subreddits, or rather had them, but they still hold significant first-mover advantage and enjoy network effects. There's a reason that /r/pics is full of insipid drivel, but there's not a more popular /r/pics2

I'm equally confused at just how bad Reddit is at identifying and removing bad actors, to the point that I'm convinced it must be intentional.

I'm not sure if the reason is as simple as the desire to pump their user numbers for earnings, or if it's something more egregious. It's not clear to me how a publicly owned company that relies on advertisers for revenue has been able to carry on for so long as a propaganda farm for foreign agents and marketing bots.

  • Oh it’s deliberate. It’s been THE online platform for far left radicalization and extremist views for at least a decade now. It’s by far the most intolerant social media platform relative to the mainstream platforms.

    • It was better before they all left Twitter... Twitter was the venue for far-left radicalization, and Reddit was mostly on-topic except /r/politics.

> What's an equal concern to me is how insufficiently resilient Reddit collectively appears to be, in face of this.

I don’t see this as being as big a problem as you do.

As soon as you solve this, you have the issue of people you see as good actors being ousted and having their influence taken down. If the bad guys can be silenced, then so can the good guys, and then it’s just a matter of how we figure out who the good guys and the bad guys are!

There are lots of very large subreddits that are prolific at shadowbanning people -- you might think you're participating in the conversation for quite a while and people just aren't upvoting you or responding to you for whatever reason, and your posts aren't visible at all. /r/worldnews is very free with them, for example.

Surely bad actors leave a fairly clear data trail. Are there no analytics being used to track this sort of behaviour? Much of the scale of this comes from being able to do it with impunity. If bad actors were exposed, even after the fact, it would be a deterrent to others.

New account just to say I know this feeling very well. Tech-parallel sub has a moderator that does literally nothing other than shittalk a specific group once every 2 weeks. People have mentioned lack of moderation effort.

I can't say who, because the motherfucker is on this website and will instantly deny it all.

> What's an equal concern to me is how insufficiently resilient Reddit collectively appears to be, in face of this.

It's more or less an open marketplace, with only a few high-level rules.

Why would it be resilient to these kinds of attacks? Human society as a whole isn't - if it were, I wouldn't have a job.

> A bad actor mod of a popular subreddit can persist for years, visibly, without people managing either to oust the mod, or to take down the sub's influence.

So, kind of like how bad companies persist in dominant market positions?

Bad actors put in a lot more effort to protect themselves than people with lives and jobs have to take them down. Anyone can bitch about Wells Fargo and Comcast, and 'tyrannical' mods, but at the end of the day, most people aren't switching their ISP or going to a forked community.

You don't have to be a moderator to poison the well.

Post a shitload of bad faith attacks and slander. Not as a root comment. You don't have to actually relate to the parent at all, you're just trying to get your talking points out there. If someone calls you out? Gish gallop never actually addressing their comments. It's another opportunity for you to spew whatever bullshit you want.

If they follow you around and get more engagement/up votes? Block them. Now you are free to continue to post whatever BS you want without any of those pesky fact checkers.

> have been an occasional thing for about a decade.

I'd estimate way higher. Most moderators of meaningful subreddits are corrupt. Occasionally one makes it visible.

Yep. For example anyone can own /r/canada, which seems like a legit Canadian representation for anyone searching about Canada.

And then make it a very opinionated/hatred/political avenue. Maybe of a right wing group.

Not saying it has happened, or that it looks like that. But it can happen very easily.

> What's an equal concern to me is how insufficiently resilient Reddit collectively appears to be, in face of this.

not a bug, a feature. those who can pay for and use the API -- which makes them money -- get to influence the discussion.

that's the business model. they DGAF about free speech or reasonable, well run subreddits so long as they can still get paid.

Not cool, you calling users “peasants”; they can’t do anything. Have you posted on Reddit, like, with an actual personal opinion? You will quickly find that it’s a moderator’s walled garden of opinions, your posts removed without explanation or notification. And complaining does not do anything.

  • I think you have it inverted. As I read it, the parent calling the users 'peasants' was to highlight precisely what you're saying: the users have no power, yes? As peasants didn't?