Comment by darksaints
3 days ago
To add to your second point, those algorithms are extremely easy for states with the resources and the desire to craft narratives to game. Specifically Russia and China.
There has actually been a pretty monumental shift in Russian election meddling tactics in the last 8 years. Previously we had the troll army, in which the primary operating tactic of their bot farms was to pose as Americans (as well as Poles, Czechs, Moldovans, Ukrainians, Brits, etc.) while pushing Russian propaganda. Those bot farms were fairly easy to spot and ban, and there was a ton of focus on them after the 2016 election, so that strategy was short-lived.
Since then, Russia has shifted a lot closer to Chinese-style tactics, and now has a "goblin" army (contrasted with their troll army). This group no longer pushes the narratives themselves, but rather uses seemingly mindless engagement interactions like scrolling, upvoting, clicking on comments, replying to comments with LLMs, etc., in order to game what the social media algorithms show people. They merely amplify the narratives of actual Americans (not easily bannable bots) who happen to push views that are either in line with Russian propaganda, or rhetoric that Russian intelligence views as harmful to the US. These techniques work spectacularly well for two reasons: they give a dopamine boost to users who say abominable shit, encouraging them to say more of it, and they kill the morale of people who might oppose such abominable shit but see how "popular" it is.
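To see why cheap interactions move the needle at all, here is a minimal sketch of an engagement-weighted ranking score. The weights, formula, and numbers are invented for illustration; no real platform's algorithm is being claimed here. The point is only that upvotes, replies, and clicks feed the score directly, so an account can boost a post without ever writing content of its own.

```python
# Hypothetical engagement-weighted ranking score (illustrative only;
# weights and decay are made up, not any platform's real formula).
def rank_score(upvotes, replies, clicks, age_hours,
               w_up=1.0, w_reply=2.0, w_click=0.25, decay=1.5):
    # Aggregate interaction signals, then decay by post age so
    # newer posts with the same engagement rank higher.
    engagement = w_up * upvotes + w_reply * replies + w_click * clicks
    return engagement / (1 + age_hours) ** decay

# An organically-engaged post vs. the same post after a coordinated
# burst of mindless interactions (no new content posted by anyone).
organic = rank_score(upvotes=40, replies=5, clicks=200, age_hours=6)
boosted = rank_score(upvotes=40 + 500, replies=5 + 60,
                     clicks=200 + 4000, age_hours=6)
```

Under these made-up weights the boosted post scores more than ten times higher, which is the whole trick: the goblins never author anything, they just inflate the signals the ranker trusts.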
https://www.bruegel.org/first-glance/russian-internet-outage...
> These techniques work spectacularly well for two reasons
Do they work spectacularly well, though? E.g. the article you link shows that Twitter accounts holding anti-Ukrainian views received, on average, 49 fewer reposts during a 2-hour internet outage in Russia. Even granting that all those reposts were part of an organized campaign (it's hardly surprising that people reposting anti-Ukrainian content are primarily to be found in Russia) and that 49 reposts massively boosted the visibility of this content, the effect is still upper-bounded by the effect of propaganda exposure on people's opinions, which is generally low. https://www.persuasion.community/p/propaganda-almost-never-w...
Notice that the two reasons I mentioned don't hinge on changing anyone's mind.
1 - They boost dopamine reward systems in people who get "social" validation of their opinions/persona as an influencer. This isn't specific to propaganda; it's a well-observed phenomenon of social media behavior. It not only gives false validation to the person spreading the misinformation/opinions, but it also influences other people who desire that sort of influence by giving them a successful example to replicate.
2 - In aggregate, they demoralize those who disagree with the opinions by demonstrating a false popularity. Imagine, for example, going to the comments of an Instagram post and seeing a blatant neo-nazi holocaust denial comment with 50,000 upvotes. It hasn't changed your mind, but it absolutely will demoralize you from thinking you have any sort of democratic power to overcome it.
No opinions have changed, but more people are willing to do things that are destructive to social discourse, and fewer people are willing to exercise democratic methods to curb it.
Do you have any evidence that a substantial number of people will be influenced in the way you claim? Again, propaganda generally has no or almost no effect.
> a "goblin" army
Hah, a "monkey amplifier" army! Look at the garbage coming out of infinite monkeys' keyboards and boost whatever fits. Sigh
> Specifically Russia and China.
...or USA
In this context, the USA does not need a "goblin army" for pushing domestic propaganda.
Silicon valley is already fully compliant.
What should make us believe any other state's propaganda is better, even for its own general population?