Comment by paradite
2 years ago
I recently started my own Discord server and had my first experience with content moderation. The demographic is mostly teenagers. Some have mental health issues.
It was the hardest thing ever.
In the first incident, I chose to ignore a certain user being targeted by others for posting repeated messages. The person posted a very angry message and left.
Come the second incident, I thought I had learned my lesson: once a user was targeted, I tried to stop others from targeting them. But this time the people who had targeted the person wrote angry messages and left.
In the third incident, someone asked a dumb question and I replied in good faith. The conversation went on and on and got weirder and weirder, until the person said "You shouldn't have replied me." and left.
Honestly, at this point I am just counting on luck to keep it running.
I'm confused: do you think some individual leaving is a failure state? Realistically, I don't think you can avoid banning or pissing off some people as a moderator, at least in most cases.
There are a lot of people whose behavior on internet message boards and chat groups can be succinctly summarized as "they're an asshole." Maybe IRL they're a perfectly fine person, but for whatever reason they just engage like a disingenuous jerk on the internet, and the latter is what's relevant to you as a moderator. In some cases a warning or talking-to will suffice for people to change how they engage, but often it won't; they're just dead set on some toxic behavior.
> I'm confused: do you think some individual leaving is a failure state?
When you are trying to grow something, them leaving is a failure.
I ran a Minecraft server for many years when I was in high school. It's very hard to balance:
1. Having players
2. Giving those players a positive experience (banning abusers)
3. Stepping in only when necessary
Every player I banned meant I lost some of my player base. Some players in particular would cause an entire group to leave. Of course, plenty of players had alternate accounts and would just log onto one of those.
I think it can be a failure state, certainly, but sometimes it's unavoidable, and banning someone can also mean more people in the community, rather than fewer.
Would HN be bigger if it had always had looser moderation that involved less banning of people? I'm guessing not.
edit: I guess what I was thinking was that often, in a community conflict where one party is 'targeted' by another party, banning one of those parties is inevitable. Not always, but often people just cannot be turned away from doing some toxic thing; they feel that they're justified in some way and would rather leave/get banned than stop.
The person leaving is the least bad part of what happened in the OP's example; try reading this again:
> In the first incident, I chose to ignore a certain user being targeted by others for posting repeated messages. The person posted a very angry message and left.
They have three examples, and all of them ended with the person leaving; it just sounded to me like they were implying that the person leaving represented a failure on their part as a moderator: that, had they moderated better, they could have prevented people from leaving.
IME, places (forums, social networks, etc.) with good moderation tend to fall into two camps in how they put it into practice:
1. The very hands-off style, which relies on the subject matter of the discussion/topic of interest naturally weeding out "normies" and "trolls", with moderation happening "behind the curtain";
2. The very hands-on style, which relies on explicit, clear rules and no qualms about acting on them, so moderation actions are referred directly back to the specific rule broken, in plain sight.
Camp 1 begins to degrade as more people use your venue; camp 2 degrades as the venue turns over to debating the rules themselves rather than the topic of interest that was the whole point of the venue (for example, this is very common in a number of subreddits, where break-off subreddits usually form in direct response to a certain rule or its enforcement).
Camp 2 works fine in perpetuity if the community is built as a cult of personality around a central authority figure: the authority figure is also the moderator (or, if there are other moderators, their authority is delegated to them by the authority figure, and they can always refer arbitration back to them); and the clear rules are understood to be descriptive of the authority's decision tree, rather than prescriptive of it — i.e. "this is how I make a decision; if I make a decision that doesn't cleanly fit this workflow, I won't be constrained by the workflow, but I will try to change the workflow so that it has a case for what I decided."
Is people leaving and founding a different forum with different rules really a failure/degradation?
It would be cool if such forks were transparent on the original forum/subreddit, and if they also forked the specific rules, i.e. like a diff, with rule 5 crossed out or changed, a new rule added, etc.
Discord is particularly tough, depending on the type of community. I very briefly moderated a smaller community for a video game, and goodness, was that awful. There was some exceptionally egregious behavior, which ultimately made me quit, but there were also smaller problems, like cliques. Any action, perceived or otherwise, taken against a "popular" member of a clique would immediately cause chaos, as people would begin taking sides and forming even stronger cliques.
One of the exceptionally egregious things that made me quit happened in a voice call where someone was screensharing something deplorable (sexually explicit content involving someone who hadn't consented to the screensharing). I wouldn't have even known it happened except that someone in the voice call wasn't using their microphone, so I was able to piece together what had happened from their typing in the voice chat's text channel. I can't imagine the horror of moderating a larger community where various voice calls are happening at all times of the day.
flamebait directed at specific groups: cliquebait
/s
IMO, some people leaving is not necessarily a bad thing. Some people are looking for someone to bully: either you allow them to bully, or they leave. The choice determines the overall culture of your community.
And sometimes people are looking for a fight and will search for it until they find it ... and then leave.
> And sometimes people are looking for a fight and will search for it until they find it ... and then leave.
I've found the more likely result is that people looking for a fight will find it, and then stay because they've found a target and an audience. Even if the audience is against them (and especially so if moderators are against them), for some people that just feeds their needs even more.
Wow, and now we all learned that nothing should be censored, thanks to this definitely real situation where the same outcome occurred when you censored both the victim and the perpetrator.
> In the first incident, I chose to ignore a certain user being targeted by others for posting repeated messages. The person posted a very angry message and left.
> Come the second incident, I thought I had learned my lesson: once a user was targeted, I tried to stop others from targeting them. But this time the people who had targeted the person wrote angry messages and left.
Makes me think that moderators should have the arbitration power to take two people or groups and (explicitly, with notice to both) make each side's public posts invisible to the other. Like a cross between the old Usenet ignore lists and restraining orders, but externally imposed, without either party actively seeking it out.
What's the problem? Moderation means people are forcibly made to leave, but just as often they'll leave voluntarily. And lack of moderation will also cause people to leave. You'll never be able to moderate in a way that doesn't cause people to be angry and leave. If you try, you'll cause other people to be angry (or annoyed by spam, etc.) and leave. Users leaving isn't a failure.
I think all this just revolves around humans being generally insane and emotionally unstable. Technology just taps into this, exposes it, and connects it to others.
Haha wtf, why would they do that?
> The demographic is mostly teenagers.
How old are you? An adult running a discord server for mentally ill teenagers seems like a cautionary tale from the 1990s about chatrooms.
My interpretation was that he ran a Discord server for a topic whose demographics happened to include a large number of teenagers and folks with mental illness, thus unintentionally resulting in a server containing a lot of them, not that he was specifically running a server targeting mentally ill teens.
I'm afraid I'm too young to understand that reference or context around chatrooms.
Anyway, the Discord server is purely for business and professional purposes. And I use the same username everywhere including Discord, so it's pretty easy to verify my identity.
I doubt it's explicitly for mentally ill teenagers. It could be, say, a video game discord, and so the demographics are mostly teens who play the game, and obviously some subset will be mentally ill.
It's probably something like this. I'm interested in a specific video game and have bounced around a lot of Discords trying to find one where most of the members are older. We still have some under-18s (including one guy's son), but they're in the minority, and that makes everything easier to moderate. We can just ban (or temp-ban) anyone who's bringing the vibe down and know that the rest will understand and keep the peace.
Teens don't have as much experience with communities going to shit, or with spaces like the workplace where you're collectively responsible for the smooth running of the group. They're hot-headed and can cause one bad experience to snowball where an adult might forgive and forget.
About the only thing that makes mentally healthy adults hard to moderate is when they get drunk or high and do stupid stuff because they've stopped worrying about consequences.
> An adult running a discord server for mentally ill teenagers seems like a cautionary tale from the 1990s about chatrooms
It sounds like a potential setup for exploitation, grooming, cult recruitment, etc. (Not saying the grandparent is doing this; for all I know their intentions are entirely above board, but other people out there likely are doing it for these kinds of reasons.)
Discord is already considered a groomer hotspot, at least in jest. You can join servers based on interests alone and find yourself in a server with very young people.
It's in vogue today.
Mental illness or not, your interactions with users on a service with a block button are all voluntary. Unless someone is going out of their way to drag drama out of Discord or, god forbid, into real life, it tends to be best to just let it happen, as everyone is participating entirely willingly and the escape is just a button away.
Communities defined by the most aggressive people who come in tend to be the ones where everyone else voluntarily leaves, because leaving is much better for them.
I see this a fair amount, and yeah, "just let people block others" is really terrible moderation advice.
Besides the very reasonable expectation almost everyone has that assholes will be banned, the inevitable result of not banning assholes is that you get more and more assholes, because their behavior will chase away regular users. Even some regular users may start acting more like assholes, because what do you do when someone is super combative, aside from possibly leaving? You become combative right back.