Comment by prmoustache

7 months ago

> The vast majority of the risk can be realised by a single disgruntled user on a VPN from who knows where posting a lot of abuse material when I happen to not be paying attention (travelling for work and focusing on IRL things)... and then the consequences and liability comes. This isn't risk I'm in control of, that can be easily mitigated, the effort required is high, and everyone here knows you cannot solve social issues with technical solutions.

I bet you weren't the sole moderator of LFGSS. On every web forum I know, there is at least one moderator online every day, plus many more senior members able to use a report function. I used to moderate a much smaller forum and we had 4 to 5 moderators at any given time, some of whom were online every day or almost every day.

I think a number of features/settings would be interesting for forum software in 2025:

- deactivation of private messages: people can use instant messaging for that

- automatically blur a post when a member hits the report button (and by blur I mean replacing the full post server-side with a placeholder image, not hiding it with client-side JavaScript)

- automatically blur posts that have not been seen by a moderator or a member with "senior level" membership within a certain period (6 or 12 hours, for example)

- disallow new members from reporting and blurring posts; only known good members get that power

None of this removes the bureaucracy of the assessments/audits mandated by the law, but it should at least keep forums moderatable and provide a modicum of protection against illegal/CSAM content. A rough sketch of how the report-and-blur flow could fit together is below.
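
To make the idea concrete, here is a minimal sketch of the report-and-blur flow, assuming a simple trust flag per member and a fixed review window. Every name in it (`Post`, `Member`, `REVIEW_WINDOW`, the placeholder URL) is hypothetical, invented for illustration; it is not code from any existing forum software:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

REVIEW_WINDOW = timedelta(hours=6)                  # auto-blur deadline (hypothetical)
PLACEHOLDER_IMAGE_URL = "/static/under-review.png"  # hypothetical asset path

@dataclass
class Member:
    name: str
    is_moderator: bool = False
    is_trusted: bool = False   # "known good" senior member

@dataclass
class Post:
    author: str
    body: str
    created_at: datetime
    blurred: bool = False
    reviewed: bool = False

def render(post: Post) -> str:
    """Server-side rendering: a blurred post's body never leaves the server."""
    if post.blurred:
        return f'<img src="{PLACEHOLDER_IMAGE_URL}" alt="post under review">'
    return post.body

def report(post: Post, reporter: Member) -> None:
    """Only trusted members or moderators can trigger a blur; new accounts cannot."""
    if reporter.is_trusted or reporter.is_moderator:
        post.blurred = True

def auto_blur_unreviewed(posts: list[Post], now: datetime) -> None:
    """Blur any post nobody on the mod/senior team has reviewed within the window."""
    for post in posts:
        if not post.reviewed and now - post.created_at > REVIEW_WINDOW:
            post.blurred = True

def review(post: Post, reviewer: Member, ok: bool) -> None:
    """A moderator's decision either restores the post or keeps it hidden."""
    if reviewer.is_moderator:
        post.reviewed = True
        post.blurred = not ok
```

The key property is in `render`: because the blur happens server-side, unreviewed content is never shipped to the browser at all, so it cannot be recovered by disabling JavaScript or reading the page source.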