Comment by eggy

12 hours ago

I'm skeptical about banning design patterns just because people might overuse them. Growing up, I had to go to the theater to see movies, but that didn't make cliffhangers and sequels any less compelling. Now we binge entire Netflix series and that's fine, but short-form video needs government intervention? The real question is: where do we draw the line between protecting people from manipulative design and respecting their ability to make their own choices? If we're worried about addictive patterns, those exist everywhere—streaming platforms, social feeds, gaming, even email notifications. My concern isn't whether TikTok's format is uniquely dangerous. It's whether we trust adults to manage their own media consumption, or if we need regulatory guardrails for every compelling app. I'd rather see us focus on media literacy and transparency than constantly asking governments to protect us from ourselves.

You can't legislate intelligence...

You are not acknowledging the fact that the companies producing these addictive apps are very much doing it intentionally. They are specifically making them as engaging as possible because that's how they make money. And they have billions of dollars to sink into making their products as irresistible as possible.

The average person has zero chance against all-pervasive, ultra-manipulative, highly-engineered systems like that.

It is, quite simply, not a fair fight.

  • That's not wrong, but it's a selective take. The entire economy operates like an addiction machine, using proven psychological techniques to modify individual and collective behaviours and beliefs.

    It's not just social media. It's gaming, ad tech, marketing, PR, religion, entertainment, the physical design of malls and stores... And many many more.

    The difference with social media is that the sharp end is automated and personalised, instead of being analysed by spreadsheet and stats package and broken out by demographics.

    But it's just the most obvious poison in a toxic ecosystem.

    • Every country in the world already does tons of intervention combating addiction. There are already bans and restrictions on gambling, drugs, alcohol, cigarettes, etc. Whether we consider social media addiction to be harmful, and how to address it, is a good question to ask, but intervention against harmful addiction is generally uncontroversial.

    • Though capitalism is to blame for plenty of problems, I don't agree with this take, and I see it repeated quite often.

      Economies, capitalist or otherwise, are very much defined by needs and wants. (With this, I presume, you agree already.)

      But addiction is another topic altogether from everyday needs and wants like oil, aspirin, or cinema tickets.

      1 reply →

    • There's a big difference in terms of frequency and availability.

      Physical design of stores gets you when you're shopping, then it's done. Organized religion tends to get its hooks into you once or twice a week. Marketing, PR, ads, all sporadic. Social media is available essentially 24/7 and is something you can jump into with just a few seconds of spare time.

      If more traditional addiction machines are a lottery you can play a few times a week, social media is a slot machine that you carry with you everywhere you go.

      6 replies →

  • >The average person has zero chance against all-pervasive, ultra-manipulative, highly-engineered systems like that.

    So you are saying I am not an average person because I have the willpower to simply not install the TikTok app or watch short form video on any platform?

    Has the bar for the average person really sunk this low?

    • If only you could reach outside your own experience and ponder what might cause otherwise reasonable people to do so. Peer pressure among young people, the current marketing landscape, being forced onto the platform if you want to make money as a creative: so many reasons. Great, you can live your life without it. Can you live your life without assuming everyone has the privilege of your situation?

    • You also probably don't use heroin. Everyone knows it's a bad idea and yet for some reason we have very severe punishments for people that distribute it. Why?

      Because addictive things are addictive, and addicted people suffer, and everyone can get addicted if their guard slips.

      We prefer to regulate highly addictive things instead.

    • > So you are saying I am not an average person because I have the willpower to simply not install the TikTok app or watch short form video on any platform?

      Yes, since more people use TikTok than not. The average person is also fat today, so this shouldn't come as a surprise to you.

      People didn't grow fat and addicted to screens due to changes in themselves; it's due to companies learning how to get people to eat more and watch more, since that's how they make more money.

    • Maybe? I really don't know. I don't want to believe it, but the data, and just looking around in public and seeing the scroll addiction, seem to indicate otherwise.

  • It's also very much an exercise in framing, though. Making your media as engaging as possible is the basic imperative of any media company. But choosing to call this specific instance of it "addictive" has everyone up in arms.

    • To the framing issue - I can frame an alternate lens through which we balance enrichment against engagement.

      Media can enrich people - expose them to new ideas, new stories, different views and opinions. This expands worldview and generally trends in the same direction as education.

      Media can also be engaging - using tools that make it compelling to continue viewing, even when other things might be preferable. On the low end: cliffhangers and suspenseful stories. On the high end: repetitive, gambling-like mechanics.

      I'd argue that if we view TikTok through this lens, banning it seems to make sense. Honestly, most short-form social media should be closely reviewed for being low-value content that is intentionally made addictive.

      ---

      It's not society's job to cater to the whims of fucking for-profit, abusive media companies. It's society's job to enrich and improve the lives of its members. Get the fuck outta here with the lame argument that I need to give a shit about some company's unethical profit motives.

      I also don't care if meth dealers go bankrupt - who knew!

      6 replies →

    • > Making your media as engaging as possible is the basic imperative of any media company.

      Not so. I think your logic is that engagement often leads to dollars, and the "basic imperative of any company" is to make dollars. There are pro- and anti-social ways to do this. You can create better art for your video games, or you can insert gambling mechanisms. You can spend more time designing your cinematic universe, or you can put a cliffhanger after every episode. You can make a funny skit, or you can say, "wait for it... wait for it... you can't believe what's about to happen!" Optimizing for engagement, for the sake of engagement, is necessarily anti-social. It's trying to redirect attention towards your media without actually making the user experience better in any way.

      Legally, the basic imperative of any company is to make dollars, as long as it does so prosocially. You should not expect the government to turn a blind eye to scam centers or dysfunctional products. The same applies to the media landscape.

    • Everything's on a spectrum, but there's a point where you're so far along on the spectrum that it makes sense to call it something else.

      See, "quantity has a quality of its own".

      Sometimes you have to leave the theoretical view aside and just look out the window. How are people using this? Is it hurting them? What can we do about it?

      I don't like blanket bans, but putting TikTok and, say, a publishing company marketing novels, in the same category because they strive for an audience, doesn't clarify anything. It just confuses the discussion.

  • I don't think we should allow any form of abusive software: addictive design, dark patterns, bait-and-switch. They all need to be robustly regulated.

    At the same time I don't think you can demonstrate harm without good evidence.

    Making money cannot be used as a criterion unless you want to draw the conclusion that no company can turn a profit and be ethical at the same time. It would amount to demanding an outcome that you don't believe is possible.

    I think overly broad criteria, like, say, infinite scroll, applied selectively to a few companies just amount to arbitrarily targeting candidates for reasons unstated outside the criteria.

    The rules need to be evidence based, clear, specific, and apply to all.

    Cracking down on TikTok while The Guardian has a bunch of dark patterns. Or the NYT, which is reporting on this while at the same time attracting people with online games that have an increasingly toxic user interface.

    TikTok may suck, but so do a lot of other businesses that escape scrutiny. I worry the harms attributed to TikTok are magnified to make it a whipping boy, drawing focus away and allowing systemic issues to persist.

  • Where does a desirable product or experience end and addictive begin though? Pretty much all products or services sold are designed to be desirable. Some things are physically addictive (nicotine, opioids etc), so those are a bit more clear. But when we're talking about psychologically addictive, where do we draw the line between what's ok and what's not?

    If my restaurant's food is so good people are "addicted" to it, that's a good thing. If it's about applying psychological patterns to trigger addictive behavior, then that applies to a large swath of marketing.

    • You really must be able to understand the difference between liking a thing and being addicted to a thing?

      If not it’s probably worth just starting with basic definitions of addiction.

  • And I’m so glad they did. Tiktok has brought so many positive changes to my life, and it never would have happened if they hadn’t built a product so good that it’s literally addictive. I don’t want the government to be my parent.

    Additionally, Instagram and Facebook have tried their best to make their products as addictive as possible, yet their recommendation algorithm is so absolutely terrible (not to mention their ads) that I barely stay on the platform for five minutes when I use it.

    • What the TikTok algorithm does for me: surfaces exercises for all my joint problems, finds people exploring local sites and reporting on local issues, helps me discover new music, reveals how we treat prisoners, shows me what it's like to do jobs from sitcom writer to oil rig tech

      What Europe does for me: Makes me click "Accept cookies"

      1 reply →

  • What's illegal about intentionally making money for being addictive? "Unfair"? Maybe. But not illegal.

  • I don't like this narrative. I'm a person, and HN is the only social media I use. I tolerate this one because I find addictiveness off-putting, and unlike other social media, HN doesn't engage in it that much.

    I'm not some sort of prodigy or anything, just a random schmuck. If I can do it, anyone can. People just really like blaming others for their own vices instead of owning up to having a vice.

    HN is a vice too. One of many that I have. And they're all mine. I've chosen them all. In most cases knowing full well that I probably shouldn't have.

    • > If I can do it, anyone can.

      Right, but they don't. Not to mention a significant portion of the target market are children whose brains are still developing.

      Smoking is a vice. Anyone can stop smoking any time they want. But it was still incredibly popular. Government regulation put warning labels everywhere, tightened regulation to ensure no sales to children, provided support to quit. And then the number of people smoking plummeted. Society is better off for it.

      "Anyone can do it" is an ideological perspective divorced from lived reality.

    • Exactly. It's not that the producers or distributors (of food, content, etc.) are not malicious/amoral/evil/greedy. It's that the real solution lies in fixing the vulnerabilities in the consumers.

      You don't say to a heroin addict that they wouldn't have any problems if those pesky heroin dealers didn't make heroin so damn addictive. You realize that it's gonna take internal change (mental/cultural/social overrides to the biological weaknesses) in that person to reliably fix it (and ensure they don't shift to some other addiction).

      I'm not saying "let the producers run free". Intervening there is fine as long as we keep front of mind and mouth that people need to take their responsibility and that we need to do everything to help them to do so.

      2 replies →

    • You haven't chosen anything. That's the point - the illusion of choice and agency.

      If you can't stop cold at any time if/when you decide to, you don't have the agency to make a free choice.

      1 reply →

    • That feels like it applies to so many things we make illegal: scams of all kinds, snake-oil medical sellers, baby powder full of asbestos. Sure, people can handle all of these things, but we've decided, as a society, that it's better not to allow them.

      So then the question is, is it better to let these things happen, as a society?

      4 replies →

    • > just a random schmuck

      If you're even on this website, you're in a tiny niche of the population. You like text? Check out the weirdo over here... oh wait, that's all of us.

    • > If I can do it, anyone can.

      This is such a normie perspective and shows just how unfamiliar you are with addiction. Yes, some people can avoid becoming addicted. Yes, some addicts can break the habit, detox, and stay clean. At the same time, a larger number of addicts can detox but relapse in a relatively short time. There are also addicts who have not yet admitted they have a problem, and addicts who are okay with being addicts. Just because you have an emergency stop button you can hit does not mean everyone else is the same way. Your lack of empathy is just gross.

  • > They are specifically making it as engaging as possible because that's [how they make money.] ... what people want.

    Fixed that for you.

    Your argument is basically the same as saying that Banana Ball should be banned because they are intentionally making the experience as fun as possible, because that's how they make money.

    • You're suggesting that it doesn't matter what children are exposed to or become addicted to, because companies should be able to sell what children want? So there are no limits to that in your mind? Should every child be given cocaine because they ask for it? They're certainly given candy, right? You must believe there's no difference between cocaine and candy; I can assure you there is a difference and can show you the evidence, if you're that dense.

      2 replies →

  • The government could spend effort on making a documentary, funding a study on brain scans, and running a little campaign to show everyone the damage and educate, rather than just wielding the ban hammer. Especially because it’s entirely possible they have a different motive for ban hammering even if the reason given is valid.

  • Do they though?

    I’d love to think of myself as an exceptional individual because I don’t use Facebook or TikTok, but most likely I’m not exceptional at all, and other people could also just not use TikTok.

  • I hate this age of zero personal accountability. It's so easy to just not doomscroll, but I should be allowed to if I want to.

    • Did you see what happened when we tried to decriminalise hard drugs in Vancouver? Feel good that you have the discipline for self-control; others do not and need help.

      You are free to not use TikTok yourself, no one is stopping you.

      Also, drug decriminalisation is very nuanced. I’m not 100% against it; I’m just pointing out that open drug use spiked afterwards.

      1 reply →

    • Personal accountability is contrary to human nature.

      We are primates dominated by our primitive urges.

  • And it’s also mostly targeting children/teenagers. As a parent you can add limitations on cinema, binging series. You can’t on TikTok.

    I’m quite glad that there is a form of control preventing a company from a different part of the world, one that doesn’t really care about the mental health or wellbeing of my kids, from creeping into their lives like that…

    As a parent, it’s not a fair fight, and I should not have to delegate that to another private company.

> The real question is: where do we draw the line between protecting people from manipulative design and respecting their ability to make their own choices?

Spoiler: There is no line. Societies (or more accurately, communities) attempt to self-regulate behaviors that have perceived net-negative effects. These perceptions change over time. There is no optimal set of standards. Historically, this has no consideration for intelligence or biology or physics (close-enough-rituals tended to replace impractical mandates).

Short-form video has been a total break from previous media and social media consumption patterns. Personally, I would support a ban on algorithmic endless short-form video. It's purely toxic and bad for humanity.

  • People are way too comfortable banning things these days. This is where the term 'nanny state' comes from. A subset of the population doesn't have self control? Ban it everyone. Even if it's a wildly popular form of entertainment with millions of creators sharing their lives, who cares we know better.

    • Even most liberal societies tend to ban addictive things. Alcohol, smoking, gambling, drugs, they are regulated almost everywhere, in one form or another.

      I think that algorithmic social media should be likewise regulated, with at the very minimum ban for minors.

      Note my focus here on the "algorithmic" part. I'm fine with little or no regulation for social media where your feed is just events in chronological order from contacts you are subscribed to, like an old bulletin board, or the original Facebook.

      Also, I think we should consider companies that provide algorithmic social media responsible for what they publish in your feed. The content may be user generated, but what is pushed to the masses is decided by them.

    • It's way more complex than "no self control". Social media is addictive by design and is peddled at such scale that it is literally impossible to ignore. It's also backed by billions upon billions of dollars.

      Pitting the average person up against that, then blaming them for having "no self control" once they inevitably get sucked in is not a remotely fair conclusion.

      4 replies →

    • > People are way too comfortable banning things these days. This is where the term 'nanny state' comes from. A subset of the population doesn't have self control? Ban it everyone. Even if it's a wildly popular form of entertainment with millions of creators sharing their lives, who cares we know better.

      Europe wants to ban algorithmic recommendation. You attack a straw man: banning all the content from creators. If you have any valid arguments, you should bring them to the discussion instead of creating imaginary enemies.

      Banning harmful design patterns is a must to protect citizens even if it ruffles the feathers of those profiting from their addiction.

      1 reply →

    • > A subset of the population doesn't have self control?

      please fix this to

      A subset of the population who has not yet reached the age of consent

      I think society broadly accepts that there are different expectations for children and adults; the line is currently officially drawn somewhere around 18-21 years old.

      3 replies →

    • The thing is, people who live in Europe actually like that companies aren't allowed to take advantage of people in every way conceivable.

      I have an idea: if you don't like regulation that protects people, why don't you fuck off to your own country and advocate for it in whatever dystopian hellhole you came from?

    • The videos are the entertainment, not the endless recommendation algorithm.

      Additionally, this is not about self control. The claim is that the algorithm is designed to exploit users. Insiders (including a designer of infinite scroll!) have admitted as much going back years: https://www.bbc.com/news/technology-44640959

      We should be uncomfortable with companies spending huge amounts of money to research and implement exploitative algorithms. We did something about cigarette companies advertising to kids. This action is along those lines.

      3 replies →

    • When most of the market using it is abusive, or a source of abuse, it makes sense to prevent the abuse from continuing while it is investigated, or better understood by the population and generations at large.

    • The "subset of the population" is not small, and there is no easy way to protect the most vulnerable.

      > it's a wildly popular form of entertainment with millions of creators sharing their lives

      I don't think we should be rewarding those who make a living by creating "content" that serves for nothing but a dopamine rush, and you can bet that those who put in the effort to create valuable content would prefer to have one less channel where they are forced to put out content just to satisfy the algorithm overlords.

      2 replies →

    • How do you feel about self-control in the face of large companies that are spending billions of dollars to intentionally trick you into not having it?

      You can't even be aware of what they're doing, because the algorithms they're using to do it are black boxes.

      YouTube's algorithms have shown evidence of leading to radicalization.

      Would you not draw a line on any of this?

  • Any good research papers on the impact of short-form video on the human brain? This is a major cause of the attention crisis we're facing, IMO.

  • Your short form comment is in violation of EU Directive 20.29A. Agents have been dispatched to your home to collect your devices.

To that end, there's no logical reason entertainment exists at all. There's a biological advantage to finding community members entertaining, but anything that broadcasts that entertainment to another community is just exploiting human nature.

By the logic of the court decision, anything that is entertaining should be banned, from movies to TV shows to any news that makes any analysis whatsoever.

The best way for TikTok to respond to this is to add some "cooling down" delay between videos. The EU commission will boast about this achievement, but effectively TikTok users will spend MORE time in the app.

It's not about banning design patterns. It's about removing the harmful results they produce.

Can you imagine if gambling were allowed to be marketed to children? Especially things like slot machines. We absolutely limit the reach of those "design patterns".

  • This argument falls apart in the EU though, where it's legal for 14-year-olds to drink alcohol.

    • That's not because EU countries want people to make their own decisions; it's because not that many people in the EU think alcohol is bad for kids.

You could make the same argument about sugary beverages, that you can't legislate intelligence, yet every country that has imposed a considerable sugar tax has seen benefits across the board. This of course omits a lot of nuance, but the main takeaway remains the same. We all have that monkey brain inside us and sometimes we need guardrails to defend against it. It's the same reason we don't allow advertising alcohol and casinos to kids, and many other similar examples. (Or at least we don't allow it where I'm from, maybe the laws are different where you're from.)

  • >every country that has imposed a considerable sugar tax has seen benefits across the board

    Is there strong evidence for that? The first study that pops up if I search this suggests otherwise, that it could increase consumption of sugar-substitutes and overall caloric intake. https://doi.org/10.1016/j.tjnut.2025.05.019

    >we need guardrails to defend against

    There is no "we". You say that I and others need it, and you want to impose your opinion by taxing us.

    • This is honestly a very silly take. You could make the same counterargument about any tax of any kind, or really any law of any kind. Like it or not, we do need both taxes and laws to function as a society.

      1 reply →

I don’t think the addictive argument is being made in good faith. Any platform with an infinite scroll feed and titillating content is intentionally made to be like a slot machine. Just keep swiping and maybe you’ll get that little dopamine hit. The idea that TikTok is dangerous, but Twitter, Instagram, porn, alcohol, and Doritos are fine, doesn’t come across as an internally consistent argument. I think the reality is that those who have an actual say in legislation perceive these platforms as a mechanism of social control and a weapon. Right now the weapon isn’t in the “right” hands.

My preferred solution would be to subsidize tools that allow people to better identify and resist compulsive behaviors. Apps like Opal and Freedom, which let you monitor your free time and block apps or websites you have a troubled relationship with, would probably see more use if everybody were given a voucher to buy a subscription. We could also fund more basic research into behavioral addictions like gambling (ideally research that couldn’t be used by casinos and sports gambling apps on the other side), and help fund the clinical trials for the next Zepbound and Ozempic.

Gambling mechanics are also banned for certain ages, and in some countries for everyone. We don’t say that it’s just a game and people should just control themselves. Without going into the specifics of this case, design-pattern interventions have existed for a long time, and in most cases they have been desirable.

  • And there are grey areas for gambling that have been settled on, like how video game "loot boxes" were recently reconsidered as gambling in some places (besides just being stupid).

I'm skeptical about banning sales of tobacco and alcohol products to children because children may (over)use them.

Also, do we trust adults prescribed OxyContin to manage their use?

We are speaking of weaponized addiction at planetary scale.

I'd go as far as saying every film ever made should have to have a concrete ending and stand on its own. I am, however, much less into "freedoms" as I get older and see people become crackheads for apps, and see the worst form of capitalism possible, where market-breaking hoarders and resellers get rich denying people both necessities and wants in equal measure. I'm also radical enough to think it should be illegal to own more than one house, or more than one car for every licensed member of a household, or to resell anything for profit. Put simply, I hate resellers. I hate hanging threads, and I hate people who design things to constantly leave people wanting or "needing" more.

You should be able to pick your own algorithm. It’s a matter of freedom of choice.

  • Yeah, I think that's the new thing these days. Companies have always been trying to make things addictive, but now they can target each and every individual. I wonder what it would look like if we had strong privacy laws, if it were illegal for TikTok to hold this private information about you.

  • So I choose an entirely chronological one, containing only that content created by my close friends and family.

    Except, I'll never be given that choice.

You can regulate power imbalances though, which is exactly what every individual faces against a multinational with vast resources.

The only reason the US and Europe are targeting TikTok is because they don't own the platform. Facebook and WhatsApp (owned by Meta) are responsible for so much hate politics and social unrest around the world (Facebook and Genocide: How Facebook contributed to genocide in Myanmar and why it will not be held accountable - https://systemicjustice.org/article/facebook-and-genocide-ho... ). Amazon, Google and Microsoft helped the Israelis conduct the genocide in Gaza with their AI tools (UN Calls Out Google and Amazon for Abetting Gaza Genocide - https://progressive.international/wire/2025-08-26-un-calls-o... ). But all that's OK.

  • Yeah, I don't like the reason either. They should've just banned TikTok day 1 as reciprocity with China banning our sites. Instead it was allowed until it started promoting wrongthink.

  • The US government would have to demonstrate improving people's lives to get votes if they couldn't campaign entirely on hate politics. Obviously they prefer the hate politics and ragebait attention algorithms. That way they can funnel billions of dollars to themselves and their buddies instead of wasting it on services supporting US citizens.

> It's whether we trust adults to manage their own media consumption, or if we need regulatory guardrails for every compelling app

I think there's a wide regulatory spectrum between those extremes, one that all sorts of governments already use to regulate everything from weapons to software to antibiotics.

It's easy to cherry-pick examples where regulation failed or produced unexpected bad results. However, doing that misses the huge majority of cases where regulation succeeds at preventing harm without imposing problematic burdens on people. Those successes are hard to see because they're evidenced by bad outcomes failing to happen: things working much as they did before, or getting worse at a slower rate than they otherwise would.

It's harder to point to "nothing changed" as a win than it is to find the pissed-off minority who got denied building permits for reasons they disagree with, or the whataboutists who take bad actions by governments as evidence that regulation in unrelated areas is doomed to failure.

> Now we binge entire Netflix series and that's fine

I mean, that's specifically fine because we have ample evidence to suggest it's just kind of a shit way to watch shows, and Netflix continually taking their own business model out back and shooting it doesn't really warrant government intervention.

More and more businesses are shifting their operations and outreach to IG and TikTok, so deciding how to live in a society is increasingly becoming "live under a rock" or "enter the casino and hope to not get swallowed up by the slop".

>I had to go to the theater to see movies, but that didn't make cliffhangers and sequels any less compelling.

The argument against tiktok (and smartphones in general) is not that experiences above a certain threshold of compellingness are bad for you: it is that filling your waking hours with compelling experiences is bad for you.

Back when a person had to travel to a theatre to have those experiences, they were unable to have them every free minute of the day.

I'm also skeptical about banning products like opium or methamphetamine, just because people might overuse them.

> people might overuse them ... cliffhangers and sequels

I once heard someone try to understand pornography addiction by asking if it was comparable to a desire to eat a lot of lemon cookies. To quote Margaret Thatcher: "No. No. No."

> Where do we draw the line

Just because it's hard to find a principled place to draw the line doesn't mean we give up and draw no line. If you are OK with the government setting speed limits, then you're OK with lines drawn in ways that are intended to be sensible but are, ultimately, arbitrary, and which infringe on your freedom for the sake of your good and the public good.

> trust adults

Please do not forget the children.

> You can't legislate intelligence

Your implication is that people who are addicted to TikTok, or anything else, are unintelligent and simply need to be educated. That is, frankly, an offensive way to enter the conversation, and, worse, a naive one.

> The real question is: where do we draw the line between protecting people from manipulative design and respecting their ability to make their own choices?

We already do it for alcohol and cigarettes: taxes, advertising and marketing restrictions, mandated health warnings.

> didn't make cliffhangers and sequels any less compelling

Apples to oranges.

I can’t make meth in my basement as a precursor to some other drug and then complain that my target product had a shitty design.

Real life experience shows that TikTok is harmfully addictive and therefore it must be controlled to prevent negative social outcomes. It’s not rocket science, we have to be pragmatic based on real life experience, not theory.

I am just as uncomfortable with this banning of ideas, or, to look at it another way, banning a design simply because it’s effective. I assume this exact same design would not be made illegal if it were terrible at increasing engagement. However, I also have to acknowledge that I already can’t stand what TikTok and its ilk have done to attention spans, and how addictive they are even across several generations. People just end up sitting there, thumb-twitching, while the algorithm pipes handpicked slop into their brains for hours a day. I really don’t want a world where everything is just like this, but even more refined and effective. So it’s tough to argue that we should just let these sociopaths do this to everyone.

Arguably, the best reason for the government to care is that whoever controls this algorithm, especially in a future when it’s twice as entrenched as it is today, has an unbelievably unfair advantage in influencing public opinion.

> I'm skeptical about banning design patterns just because people might overuse them.

I used to be opposed; now I'm not. I strongly believe specialization is the niche humans have adapted to fill, and that it should be encouraged. Another equally significant part of human nature is trust, and with it gullibility. People will abuse these aspects of human nature to give themselves an unfair advantage. If you believe lying is bad and that laws should punish those who lie to gain an advantage, or if you believe that selling an endless, addictive substance should be restricted, then you already agree.

There are two bars in your town, and shady forms of alcohol abound. One bar is run by someone who will always cut a patron off after they've had too many, and who goes to extreme lengths to ensure that the only alcohol they sell is ethanol. The other is run by someone who doesn't appear to give a fuck and is constantly suggesting you should have another; some of their patrons have even gone blind.

I think a just society would allow people to specialize in their own domain without also needing a PhD in the effects of alcohol poisoning, which alcohols are safe to consume, and in what quantities.

> Growing up, I had to go to the theater to see movies, but that didn't make cliffhangers and sequels any less compelling. Now we binge entire Netflix series and that's fine, but short-form video needs government intervention?

Yes, the dopamine feedback loop of endless short-form scrolling has a significantly different effect on the brain's reward system. And, in line with the point that everyone shouldn't need a PhD, people also need to be able to trust the conclusions of experts.

> The real question is: where do we draw the line between protecting people from manipulative design and respecting their ability to make their own choices?

It's not that linear a distinction. We don't have to draw today the line where we'll stop forever; it's perfectly fine to iterate and reevaluate. Endless-scroll algorithms fed by enormous pools of content are, without a doubt, addictive. Where's the line on cigarettes, or now vapes? Surely they should be endlessly available to children, because where do you draw the line?

(It's mental health. Cigarettes and alcohol are bad for physical health, but no one, rhetorically speaking, gives a shit about mental health.)

> If we're worried about addictive patterns, those exist everywhere—streaming platforms, social feeds, gaming,

I'd love to ban micro transactions and loot boxes (gambling games) for children.

> even email notifications.

That's reductio ad absurdum, or perhaps you meant it as a whataboutism?

> My concern isn't whether TikTok's format is uniquely dangerous.

Camels and Lucky Strike are both illegal for children to buy.

> It's whether we trust adults to manage their own media consumption, or if we need regulatory guardrails for every compelling app.

We clearly do need them. Companies are exploiting the brain's natural dopamine system for their own advantage, at the expense of the people using their applications. Mental health deserves the same prioritization and protection as physical health. I actually agree with you that banning an activity which harms no one else, and is only a risk to yourself, among reasonably educated adults is insanely stupid. But that's not what's happening here.

> I'd rather see us focus on media literacy and transparency than constantly asking governments to protect us from ourselves.

I'd rather see companies that use an unfair disparity of power, control, knowledge and data, be punished when they use it to gain an advantage over their consumers. I think dark patterns should be illegal and come with apocalyptic fines. I think tuning your algorithm's recommendation so that you can sell more ads, or one that recommends divisive content because it drives engagement, (again, because ads) should be heavily taxed, or fined so that the government has the funding to provide an equally effective source of information or transparency.

> You can't legislate intelligence...

You equally can't demand that everyone know exactly why every flavor of snake oil is dangerous, so you should punish those who pretend it's safe.

Especially when there's an executive in some part of the building trying to figure out how to get more children using it.

The distinction requiring intervention isn't that these companies exist. The intervention is required because the company has hired someone whose job is to convince children to use something the company knows is addictive.

What an unworldly remark. So, we should also not ban hard-drugs then?

  • Yeah, prohibition is a terrible policy for everyone except the cops, jailers (including private, for-profit jailers), government spooks, smugglers, arms dealers, hitmen, chain and shackle manufacturers, etc. who make a living from it. I'm taxed to pay some of the world's most odious people to stop a small percentage of the supply of these drugs. Meanwhile, the vast majority of the supply makes it through and causes untold suffering for addicts, often thanks to other (or the same) taxpayer-funded bad guys and an onramp provided by the legal pharmaceutical industry. In the impoverished countries where the supply comes from, all this revenue funds hellish slave/feudal economies where a small violent elite terrorize, torture, and kill working people. Even in the developed world, addicts are weaponized by others for all kinds of violence (drug gangs, human trafficking rings, etc.) and net-negative property crime (stripping copper from abandoned houses, stealing catalytic converters, etc.).

    In short, banning hard drugs is very very obviously a losing policy that serves only to enrich the world's worst people at the expense of everyone else.

  • > So, we should also not ban hard-drugs then?

    Is this a serious question? Have you been asleep since the '70s? Are you not aware of how the War on Drugs has been going?

  • Yes, many intelligent people DO think we should not ban any drugs/substances and that the best way to deal with them is instead regulate and set up societal structures and frameworks that support the issues around abuse.

    The science tends to back these ideas up. Banning does not stop people from doing what they want.

    Education and guard rails are always better than hard control.