
Comment by bogdanoff_2

1 month ago

The solution to this would be a law forcing these sites to allow third-party suggestion algorithms, so that you can choose who suggests content to you, and how.

It could perhaps be as simple as allowing third-party websites and apps for watching YouTube on your phone. And it's okay if this were a premium paid feature, so there's no counterargument that "it costs them money to host videos".

This is not an entirely new idea either. Before Spotify became popular, people would integrate Last.fm into their media players to get music recommendations based on their listening history, and you could listen to music via YouTube directly on the Last.fm website.

The solution to all of Big Tech's monopolies is actually pretty simple: interoperability must become law. This includes using custom algorithms and allowing other platforms (like your own app) to access YOUR data on whatever platform 'hosts' it.

Cory Doctorow wrote a great article on it:

"Interoperability Can Save the Open Web" https://spectrum.ieee.org/doctorow-interoperability

> While the dominance of Internet platforms like Twitter, Facebook, Instagram, or Amazon is often taken for granted, Doctorow argues that these walled gardens are fenced in by legal structures, not feats of engineering. Doctorow proposes forcing interoperability—any given platform’s ability to interact with another—as a way to break down those walls and to make the Internet freer and more democratic.

Most notably, he retells how early Facebook used to siphon data from its competitor MySpace and act on users' behalf there (e.g. replying to MySpace messages via Facebook) - and then, once the Zuck(er) was top dog, moved to make these basic interoperability actions illegal to prevent anyone from doing to him what he did to others.

  • We can’t depend on these platforms to offer interoperability, or even on laws to force them to do so. The DMA forced Apple to allow third-party app stores in Europe, and they still hampered it so much that hardly anyone uses it.

    We need platforms to offer that interoperability and simply connect to these “marketplaces.” Take Shopify, for example: sellers use that platform to list on Amazon, Google Shopping, TikTok Shop, etc. We need open source alternatives to those, where the sellers own the platform and these marketplaces are forced to be interoperable or left behind by those that are.

    For Facebook, Instagram, Twitter, each person having their own website where they post and that post being pushed to these platforms is also another way to force interoperability on them or be left behind.

    It’s a tall task, but achievable and it will happen given enough time.

    • > For Facebook, Instagram, Twitter, each person having their own website where they post and that post being pushed to these platforms is also another way to force interoperability on them or be left behind.

      There's an acronym for this: POSSE (Publish [on your] Own Site, Syndicate Elsewhere). Part of the IndieWeb movement, for those who want to explore this worthwhile idea further.


    • How will it happen? Writing open source code is one thing, maybe enough people will volunteer their work. But running an operational marketplace or social media platform is something else entirely. You need a real revenue stream to pay for hardware, connectivity, operations staff, regulatory compliance, etc. That stuff isn't cheap.


  • The foundational problem with interoperability is that it can and will immediately be abused by bad actors as long as there is no price tag attached to every piece of communication.

    Among social media, Mastodon (and anything Fediverse) has it the worst, obviously, but Telegram and WhatsApp are rife with spam and scams, and Twitter, back when it still had third-party apps, was rife with credential and token compromises (mostly used to shill cryptocurrencies).

    As for the price tag reference: we've seen this with SMS. It used to be the case that sending an SMS cost real money, something like 20 ct/message, which made it prohibitively expensive to run SMS campaigns. But nowadays? It's effectively free at scale if you go the legit route, and practically free if you manage to compromise an account at one of the many bulk SMS providers. Apple's iMessage similarly makes bad actors pay a lot, because access to it is tied to a legitimate or stolen Apple product serial.

    • But bad actors already do this, as there is a monetary incentive to implement adversarial interoperability. There is then an incentive to not scale it up too much, lest that implementation get cut off sooner. For example, I certainly don't think all of the spam ads I see on Faceboot Marketplace are from individual people manually creating accounts and typing them out.

    • Paywalls can have the opposite of the effect you want. Implemented incautiously, they can fail to disincentivize parties whose profits exceed the cost, while succeeding at disincentivizing genuine, non-profit-motivated interaction.

      Imagine how much less you would use text messages if they still had a per-message cost.


    • Because some hostile entity might rat-fuck a slightly better system, we're destined to keep using the same current shitty system, because something better might have a downside?

      Do you understand that this is all literally made up? The rules can change anytime, and society can exert its will to make a better world rather than letting a dozen people decide how technology will shape humanity (mostly in a negative capacity, if you look at the current state of things).


    • This is a confusing comment. Interoperability and bad actors are separate concerns, because you get bad actors in systems of all kinds, not just in interoperable systems. Paywalling a system does not necessarily mitigate bad actors, either.

  • Breaking up these monopolies would be a good start. We aren't supposed to have those. There used to be something we called "regulations" but they got rid of that part I think. Elections have consequences.

  • That just leads to embrace/extend/extinguish

    • Exactly. The deal of all these platforms is that there is a fuckton of up-front costs. Hard drives. Networks. Peering. Transit. Operators. Payment. Lawyers. SREs. And so on and so forth.

      The solution to this used to be that governments provide the platform. You would think this wouldn't be hard to do, since people have now shown that this can work and so it's a guaranteed money maker, or as close as you're going to get.

      Yet I can't find a single initiative.

      So any such rules will just make all internet platforms disappear ... and nothing will replace them.


  • Be careful what you wish for. Making it easier to access your data in a standard way just means more companies and governments will ask for it.

It seems likely that'd result in even worse suggestions becoming the norm, as people adopt whichever third party gives the quickest dopamine rush. It's like suggesting tastier heroin to fix drug addiction.

  • There's a difference between addictiveness and enjoyment, and definitely between addictiveness and satisfaction.

    While the thing that gives you quick dopamine might win in the very short term, you can still step back and recognize when it's not satisfying in the long term and you're not even enjoying it that much.

    And people aren't stupid. Junk food exists, yet lots of people choose to eat more wholesome food as the majority of their diet.

    The problem with instagram or youtube is that you can't separate the good from the bad.

    It's like if every time you went to store Y to buy milk, you would be exposed to highly manipulative marketing trying to get you to buy junk food. You would probably want to go to a different store instead.

    What I'm suggesting is the possibilities of different stores, with different philosophies and standards, so that people can choose where they go. Corner stores (where almost everything is junk food) exist, yet people still choose to go to real supermarkets.

    • > It's like if every time you went to store Y to buy milk, you would be exposed to highly manipulative marketing trying to get you to buy junk food.

      But that's very much the norm at supermarkets?

  • Parent poster has some… interesting and popular but entirely false views on neuroscience. Specifically, an extremely outdated view of concepts like the role of dopamine and dopaminergic neuronal populations in human cognition. Rather than an understanding based on the science (incentive salience and valence are modulated by such populations), he attributes pleasure and enjoyment to them because of a meme.

  • Certainly not. People don’t want the slop they push, the anxiety provoking, salacious, clickbaity spam that it has devolved into. Anybody that used YouTube before the last few years can tell you the difference is pretty major. This is not content people want, it’s content that maximizes clicks and ad sales.

    • People don't want to want it. But it's not obvious that merely allowing a choice of recommendation algorithms would allow people to escape the slop. Isn't anyone strong enough to choose a less addictive algorithm necessarily strong enough to not scroll Instagram for hours in the first place?


    • > People don’t want the slop they push…

      That's also true for heroin. Plenty of people really want to break the addiction.

      The slop exists because people are attracted to it.


Anything that’s a premium paid feature will be irrelevant. Most people don’t subscribe to YouTube premium, even though they know their kids are watching a ton of ads. Adoption has also been incredibly brisk on the ad tiers of the formerly ad-free TV services like Netflix and Hulu.

I realize a “less addictive algo” is a different thing to pay for than removing ads, but it's, if anything, an even harder sell. I think the layperson wouldn't even acknowledge that they are vulnerable to being psychologically manipulated. They think they spend so much time on these apps because it's so enjoyable.

From most parents’ point of view, paying a monthly bill for their children to have a less toxic experience on TikTok or YouTube will be considered an extravagance instead of a responsible safety expense.

Third-party recommendation algorithms would be interesting, but I think they'd only address one layer of the addictive design the verdict is actually about. Autoplay, infinite scroll, notification timing, the variable reward patterns from likes and comments -- those are all independent of which algorithm picks the next video. You could swap in the most wholesome recommendation engine imaginable and a kid is still gonna sit there for hours if the UI is designed around endless content with no natural stopping points.

The real solution is going back to a chronological feed of people you actively choose to follow.

  • At the very least, that should certainly be an option that users can select. And when the user selects a feed algo, it should stay fucking set until that same user actively chooses to change it.

Bluesky does this. In fact, the For You algorithm is a community built algorithm and way more popular than the native Discover algo.

> Before Spotify became popular, people would integrate Last.FM into their media players

I still scrobble to Last.fm from Spotify (and other media players). I rarely use it for discovery anymore, but it's occasionally interesting to look at my historical listening trends.
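Incidentally, the scrobbling mechanism is pleasantly small: each play is submitted as a signed form POST. A rough sketch of how a Last.fm request signature is assembled per its API conventions (sort parameters by name, concatenate name+value pairs, append the shared secret, MD5 the result); the keys below are hypothetical placeholders, not real credentials:

```python
import hashlib

def lastfm_signature(params: dict[str, str], secret: str) -> str:
    """Build a Last.fm-style API method signature: concatenate
    key+value pairs sorted by key, append the shared secret,
    then take the MD5 hex digest."""
    raw = "".join(k + params[k] for k in sorted(params)) + secret
    return hashlib.md5(raw.encode("utf-8")).hexdigest()

# A track.scrobble call would POST these params plus the computed
# api_sig to the API endpoint ("sk" is the user's session key).
params = {
    "method": "track.scrobble",
    "artist": "Some Artist",      # hypothetical example values
    "track": "Some Track",
    "timestamp": "1700000000",
    "api_key": "DUMMY_KEY",       # placeholder, not a real key
    "sk": "DUMMY_SESSION",        # placeholder, not a real session
}
api_sig = lastfm_signature(params, secret="DUMMY_SECRET")
```

Because the signature covers every parameter, a third-party player only needs the user's session key to scrobble on their behalf, which is exactly the kind of narrow interoperability being discussed.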

This seems like a clever (but perhaps overly clever) amendment to Section 230 protections for social media.

However, I've always thought that it's pretty bizarre for Section 230 protections to apply when the social media company has extremely sophisticated algorithms that determine how much reach every user-generated piece of content gets. To me there's really no distinction between the "opinion" or "editorial" section of a traditional media publication and the algorithms which determine the reach of a piece of user-generated content on Twitter, YouTube, etc.

Or just stop suggesting content. The landing page is just a matrix of already followed accounts with the text "Start by following some accounts you like..." as a placeholder if it's a new account.

I’m quite bullish on disintermediating the algorithms. AI makes it very easy to plug in your own. We just haven’t figured out the plumbing yet.

I’d be strongly in favor of interoperability laws to pry open the monopolies.

(One dynamic you do need to be careful about especially at first - interoperability also means IG can pull your friend graph from Snapchat, so it can also make it easier for big companies to smother smaller ones that are getting momentum based on their own social graph growth due to their USP. I don’t think this is insurmountable, just something to be careful of when implementing.)

If the default algo/behavior is allowed to persist, it's going to be effectively no real change.

Drop the algorithm altogether? I subscribe to channels for a reason.

How do you prevent a Cambridge Analytica exfiltration situation with third party algorithms?

And how does this prevent addictive algorithms which will win through social selection?

  • The Cambridge Analytica stuff never got fixed, it just got hidden out of sight. The situation is worse than ever now.

That’s like saying the solution to cigarettes is that tobacco shops must be forced to sell clove cigarettes as a non-addictive alternative.

Yes please. Algorithms should be plug-and-play, not baked into the app. You should be able to take popular algorithms and plug them into any app.
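To make the plug-and-play idea concrete, here's a minimal sketch (all names here are hypothetical, not any platform's actual API): the app exposes its candidate pool, and a "recommendation algorithm" is just a swappable ranking function the user chooses.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Post:
    author: str
    timestamp: int   # Unix seconds
    topic: str

# A "recommendation algorithm" is just a pluggable ranking function:
# it takes the candidate pool and returns posts in display order.
RecAlgo = Callable[[list[Post]], list[Post]]

def chronological(posts: list[Post]) -> list[Post]:
    """Newest first, no engagement optimization at all."""
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

def topic_first(topic: str) -> RecAlgo:
    """A user-chosen policy: surface one topic, then the rest by recency."""
    def algo(posts: list[Post]) -> list[Post]:
        return sorted(posts, key=lambda p: (p.topic != topic, -p.timestamp))
    return algo

def render_feed(posts: list[Post], algo: RecAlgo) -> list[str]:
    # The host app only renders; the ranking policy is swappable.
    return [f"{p.author}: {p.topic}" for p in algo(posts)]
```

The point of the sketch is the boundary: the platform owns storage and rendering, while the ranking function behind `render_feed` could come from anyone.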

  • That's just laundering the bad actions through a third party, though.

    The winning third party algorithm will be the one that gives people the same rush the first party algorithms currently do, because people will use it for the same reasons; they get to see cute AI animals do crazy things forever.

    • Nah. The main issue of addiction is the lack of clarity. You allow an addiction to pretend like it has a purpose and it can stick around. Reductionism is a great tool against this: reduce your addiction down to its most clear state and it loses all the mystique. Right now, people can pretend like they're into tiktok or youtube or instagram or twitter etc because they wanna engage in the social media landscape. Pull out the algorithm and replace it with a different one and they can't keep that lie up. They have to admit they're into the dopamine itself.

Virtually nobody would choose to pay a subscription for the non-addictive app version, and I'd even say this suggestion is a bit insulting to anyone who isn't high-income.

  • I will never pay a subscription for the current clickbaity slop. I might if the algorithm were better, closer to YouTube of 10 years ago, when it would suggest lectures, artfully done film shorts, and overall more interesting, high quality content.

    • 10 years ago the most popular 100 videos on YouTube were all pop music videos. Justin Bieber had 3 of the top 10.

      The YouTube algorithm has been personalized for much more than 10 years and has never prioritized lectures or artful films over anything else it thinks a viewer will watch. You're asking them to bring back an era that never existed.

      If you're not getting those sorts of recommendations, it's because you don't actually watch that kind of content, or you're removing your history.


Seriously? You think they should allow random third parties to inject code into their platforms, with all the possible security risks? Regardless of the intent, that is a terrible idea.

Or algorithms have to be submitted to and approved by a government body before being allowed to be implemented, and are frequently audited.

  • I guess this is the only way. I don't think we need a novel approach, and I don't consider this one, since we already have government agencies verifying approved processes in other areas. So why not content distribution?

The only solution is to outlaw all recommendation algorithms. Accounts should only have access to a chronological feed they choose to follow. The host can promote whatever they want, but it has to be the same promotions for everybody.
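Mechanically, a follow-only chronological feed is about as simple as feeds get: take each followed account's (already time-sorted) post stream and interleave them, newest first. A sketch with hypothetical data shapes:

```python
import heapq

def chronological_feed(follows, posts_by_account):
    """Merge followed accounts' post streams, newest first.

    posts_by_account maps account -> list of (timestamp, text),
    each list already sorted newest-first, as a store would keep it.
    """
    streams = [posts_by_account.get(acct, []) for acct in follows]
    # heapq.merge keeps the merge lazy even with many followed accounts.
    merged = heapq.merge(*streams, key=lambda p: p[0], reverse=True)
    return [text for _, text in merged]
```

There is no per-user model, no engagement signal, and nothing to audit: every follower of the same accounts sees the same ordering, which is exactly the property being argued for.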

  • I like recommendation algorithms. If someone on my friends list posted about a major life event a few days ago and I haven't seen it yet then I want that prioritized first, before more recent posts. Chronological feeds should be an option for those who want them but they shouldn't be forced on anyone.

I think a better solution would be to repeal section 230 protection for any kind of personalized or algorithmic feed. The algorithm makes you a publisher, and you should be liable for what you publish.

That would make it very hard, nigh impossible, for a platform like YouTube or TikTok to exist as it does today, and would instead favor people self-curating mechanisms like RSS readers etc.

  • >and would instead favor people self-curating mechanisms like RSS readers etc.

    That isn't what would happen.

    What would happen is that only the platforms which can afford legal teams - in other words, the big platforms - would host user-posted content under strict arbitration-only terms, and every other platform (including Hacker News, which uses an algorithmic feed) would simply not. Removing one of the cornerstones of free speech on the web in favor of regulation will only centralize the web more.

    And you wouldn't see mass adoption of "self curating mechanisms" because most people aren't like Hacker News people and would find the premise of having to manually curate data feeds from every site they visit to be a tedious waste of their time.

    I also think that platforms like Youtube and Tiktok shouldn't be illegal. I don't even think that personalized algorithms should be illegal - it's surprising that one has to point this out on a forum of programmers - but algorithms have no inherent moral dimension and the ability to use an algorithm to find and classify relevant content can be useful. The same algorithm that surfaces extremist content surfaces non-extremist content. The algorithm isn't the problem, rather the content and the policies of these platforms are the problem. And I don't think the solution to either is de facto making math illegal and free speech more difficult.

  • How is RSS self curating? It's just a way to get a feed from somewhere. And under the maximally external-locus-of-control culture this jury is using, those feeds would themselves be deemed evilly addictive.

    There is no solution for this kind of verdict beyond appeal, or changes to the law to rule such suits out, because it's not rooted in any logical or legal principle beyond the idea that people should not be responsible for their own actions (or their children's actions). But there's no limiting factor to that belief. You can't fix it with RSS or federation or making people select who they follow or chronological feeds. Those would just get blamed for "addiction" instead.

    • Each blog you follow in the RSS model you opted in to. And each post comes from a person, or a publication, who can be held accountable for what they publish.

      Ordinary media, like newspapers, books, radio, and TV, have worked this way forever — people publish “channels” and you decide what channels to follow. A channel can be held accountable.

      The algorithm model is different. People just publish “content” into the platform, and the platform makes a custom channel for each viewer, inserting content from people you’ve never heard of and didn’t ask to follow. And it optimizes that custom channel for whatever addicts you the most. That’s fundamentally a different beast than opt-in media consumption.
