Comment by ksynwa

12 hours ago

> this could just be the beginning of our society beginning to scrutinize these platforms.

Could not be more wrong. "Society" is not deciding anything here. The ban is entirely for ideological and geopolitical reasons. They have already allowed the "good" big tech companies to get people hooked as much as they want. If you think you are going to see regulation for the public good, you will probably be disappointed.

The US government will do nothing to regulate US-owned social networks because they're doing for free the work that the government wants to do itself: collect as much data as possible on each individual. The separation between Meta's collected data and the government is just one judicial request away. That's why the US government hates other countries having this power.

The TikTok divestment law was passed overwhelmingly by both houses of the duly elected Congress. At the time, a majority of Americans polled supported the law, while a minority opposed it: https://abcnews.go.com/Politics/more-support-than-oppose-tik....

In a democracy, this is how "society decides" what's in the "public good." This is not a case where legislators are going behind the public's back, hiding something they know the public would oppose. Proponents of the law have been clear in public about what the law would do and what the motivations for it are. There is nothing closer to "society decides" than Congress overwhelmingly passing a law after making a public case for it.

Yes, they're doing it for "ideological and geopolitical reasons", but those things are important to society! Americans are perfectly within their rights to enact legislation, through their duly elected representatives, simply on the basis of "fuck China."

  • This may in some ways be technically correct, but it is also true that in a democracy, the elite make decisions with the support of the people through manufactured consent. This process involves the manipulation of the populace through mass media, to intentionally misinform and influence them.

    One could take the position that this process is so flawed as to be illegitimate. In this case it would be a valid position to believe that society had not fairly decided these things, and they were instead decided by a certain class of people and pushed on to the rest of us.

    See: A Propaganda Model, by Edward Herman and Noam Chomsky: https://chomsky.info/consent01/

    • That's the notion of "false consciousness" that Marxists trot out to justify why they're right even though people don't agree with them. It's a tool for academics to justify imposing themselves as right-thinking elites who know better than the unwashed masses.

      1 reply →

  • 100% agreed, unfortunately. There is truth in sayings like "the customer doesn't know what's best for them"... I think because they are often simply not informed or intelligent enough.

    • Most people are sufficiently informed and intelligent. They simply don't (1) care about the things you care about; or (2) don't agree with you that your preferred approaches will bring about desired outcomes.

      2 replies →

It can still be both, in the sense that once a precedent is set, using these additional ideological and geopolitical motivations as momentum, maybe there will be an appetite for further algorithm regulations.

As a tech person who already understood the system, it's refreshing that I now often see the comment "I need to change my algorithm", meaning: I can shape the parameters of what X/Twitter, Instagram, YouTube, or TikTok shows me in my feed.

I think there's growing meta-awareness (that I see as comments within these platforms) that there is "healthy" content and that the apps themselves manipulate their users' behavior patterns.

Hopefully there's momentum building toward people perceiving this as a public health issue.

  • These bans, done for political purposes toward public consent for genocide (i.e., see ADL/AIPAC's "We have a big TikTok problem" leaked audio, and members of our own Congress stating that this is what motivates the regulations), won't lead to greater freedoms over algorithms. It is the opposite direction: more state control over which algorithms its citizens are allowed to see.

    The mental health angle of support for the bans is how the change gets accepted by the public (which posters here are doing free work toward generating), not a motivating goal or direction for these or the next regulations.

    • > bans done for political purposes

      You want a political body to make decisions apolitically?

      > mental health angle of support

      This was de minimis. The support was, start to finish, about national security. There was some cherry-on-top AIPAC and protectionist talk. But the votes were won because TikTok kept lying about serious stuff [1] while Russia reminded the world of the cost of appeasement.

      [1] https://www.blackburn.senate.gov/services/files/76E769A8-3ED...

      2 replies →

    • Yea, it might be naive to think the government will act in the interest of the consumer (although it has happened before), but at least maybe it'll continue the conversation among users themselves...

      This situation is another data point and is a net good for society (whether or not the ban sticks).

      Discussion around (for example) the technical implementation of content moderation being inherently political (i.e., Meta and Twitter) will be good for everyone.

Yeah, the ban is interesting because it’s happened before (company being forced to sell or leave), but never to a product used at this scale. There are allegedly 120M daily active users in the US alone. That’s more than a third of Americans using it every day.

While many have a love-hate relationship with it, there are many who love it. I know people who aren’t too sad, because it’ll break their addiction, and others who are making really decent money as content creators on it. So generally, you’re exactly right: “society” is not lashing back at TikTok. Maybe some are lashing back at American social media companies (e.g., some folks leaving Twitter and Meta products).

But if we wanted to actually protect our citizens, we’d enact strong data privacy laws, where companies don’t own your data, you do, and can’t spy on you or use that data without your permission. This would solve part of the problem with TikTok.

  • While data privacy laws would be good, I don’t see how they would help with TikTok, since TikTok has no reason to actually follow the laws when the CCP comes calling.

That's because "being hooked" is not, by itself, why it is being banned. It's banned because people are hooked on it and an adversarial foreign power has the ability to use it for their own gain.

Which is why a viable solution for TikTok was selling it to a US company. If it was just about the population "being hooked", a sale would not be an acceptable outcome.

More specifically, the ban is because the platform was being used to support Palestine. There are public recordings of congressmen openly and plainly saying so.

  • Many other platforms have been used for that for even longer, and none of them are in danger of being banned. I don't think that's the real reason, if there is even a single reason.

    • I believe the singular reason is that TikTok is controlled by the CCP and they use it as a tool to further increase political and social division by manipulating the algorithm.

      This is evidenced by the fact that ByteDance could have sold TikTok's US operations for a huge amount of money to comply with the recent legislation, but the Chinese government won't allow the sale. They aren't interested in the money, which to me suggests they only ever cared about the data and influence.

      Side note: I used Perplexity to summarize the recent events to make sure I'm not totally talking out my butt :). Just a theory though, happy to be proven wrong!

      1 reply →

    • First, they are American platforms, and already do a lot of filtering. It's not easy to ban an American platform either, and there is more leverage to twist their arms.

      Second, how does your comment change the fact that there are multiple politicians on record saying this is why they are going after TikTok?

By “this” I think they meant this moment in time rather than the ban being a result of societal scrutiny.

Agree, it was just a shakedown and money grab.

Some US oligarchs wanted to buy TikTok at a deep discount while it was private, and make money off taking it public.

  • Why would it be sold at a deep discount?

    About 45% of the US population uses TikTok, and 63% of teens aged 13 to 17 report using TikTok, with 57% of them using the app daily.

    Hell of a product; there would be a crazy bidding war for that kind of engagement.

    • Because if the Chinese government actually is using it, or plans to use it, as a propaganda tool, there is no amount of money they would accept. The fact that it wasn't sold to a US company lends credibility to the idea that the product is useless to China if it's controlled by a US company, and that they wanted to keep what they learned about addiction to themselves. They also probably wanted to build some outrage among young users toward the government for banning their favorite app.

      The "sell or be banned" option, instead of an outright ban, was almost certainly lobbied for by US social media companies hoping to acquire it, on the off chance it had served its purpose, wasn't as useful as China had hoped, or (the slim chance) China really did just want Americans to copy dance trends.

    • If the US government says who is allowed to buy, and the buyers collude (by pooling financial and political capital together), they can easily avoid a bidding war and lowball instead.

      16 replies →