
Comment by ranyume

1 day ago

> Monitoring children's DMs is the responsibility of the parents, not megacorps

Absolutely. But what responsibilities do megacorps have? Right now, everyone seems to avoid this question and settles for megacorps not being responsible. That amounts to saying: "we'll allow megacorps to stay as they are and take no responsibility for the effects they cause in society". Instead of making them take responsibility, we're collecting everyone's data and calling it a day by banning children from social networks... and that's because many interests are involved (none of them related to child development and safety).

> But what responsibilities do megacorps have? Right now, everyone seems to avoid this question

Clear, simple, direct: Whatever was required of The Bell Telephone Company and nothing more.

  • So there should be a human operator manually gatekeeping every individual request to connect with another endpoint?

    It's a good thing those human operators couldn't listen in to whichever conversation they wanted.

    • Human operators were not required of The Bell Telephone Company by law. Bell switched to mechanical switching stations as soon as doing so was economically advantageous.

      (Reconsider my post. I'm arguing for no regulation.)


  • I'd say that, at minimum, social networks should be required to show how their algorithm works and to give users control over their data. Users must be able to know why a piece of content was served to them. Social networks are now so pervasive in society, affecting it and molding it to unknown interests, that this is the bare minimum for a free society.

    Ideally, users should be able to modify the algorithm so they get exactly what they want, while simultaneously maximizing free speech. If something isn't illegal, it shouldn't be hidden or removed.
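A "user-modifiable algorithm" could be as simple as exposing the ranking weights. This is a toy sketch under my own assumptions (the field names, weights, and scores are all invented for illustration, not any real platform's system):

```python
# Hypothetical sketch of user-controlled feed ranking: the platform
# exposes the scoring weights and the user is free to edit them.
from dataclasses import dataclass


@dataclass
class FeedPreferences:
    recency: float = 1.0       # favor newer posts
    engagement: float = 1.0    # favor widely-liked posts
    from_friends: float = 1.0  # favor posts from people you follow


def rank(post: dict, prefs: FeedPreferences) -> float:
    """Transparent score: every term is visible and user-tunable."""
    return (prefs.recency * post["recency_score"]
            + prefs.engagement * post["engagement_score"]
            + prefs.from_friends * post["friend_score"])


posts = [
    {"id": "a", "recency_score": 0.9, "engagement_score": 0.1, "friend_score": 0.0},
    {"id": "b", "recency_score": 0.2, "engagement_score": 0.9, "friend_score": 0.3},
]

# A user who zeroes out engagement-chasing sees the fresher post first.
prefs = FeedPreferences(engagement=0.0)
feed = sorted(posts, key=lambda p: rank(p, prefs), reverse=True)
print([p["id"] for p in feed])  # ['a', 'b']
```

With a linear, additive score like this, "why was this content served to me" has a direct answer: the per-term contributions can be printed alongside each post.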

    • > Nowadays social networks are so pervasive in society, affecting it and molding it to unknown interests

      I think this is the real issue. We should free ourselves from "social networks" such as Tiktok, Facebook, Instagram and others. Even with direct messages truly E2EE, they create countless other privacy problems. They enable surveillance of people at scale and should be completely shunned for that reason alone.


    • > social networks need to be required to show how their algorithm works

      Hypothetically speaking: what if it's a neural network in which each user has their own unique weights, which undergo frequent retraining?

      Would it not be an undue burden to require releasing the weights every time they change?

      Also, what value would the weights have? Neural network interpretability is still an open research problem.

      Wouldn't enforcing algorithmic interpretability be an additional undue burden?

      > They must be able to know why a content was served to them.

      What if the authors of the code are unable to tell you why?
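That hypothetical can be made concrete with a toy model (all names, shapes, and numbers here are my own assumptions, not any real platform's system):

```python
# Hypothetical sketch: a per-user ranking model whose weights drift with
# every retraining pass. Illustrates why a released weight snapshot would
# neither stay current nor explain "why this item was shown".
import numpy as np

rng = np.random.default_rng(0)
EMBED_DIM = 16  # arbitrary toy dimensionality


def score(user_weights: np.ndarray, item_embedding: np.ndarray) -> float:
    """Rank one item for one user: a dot product over learned features,
    none of which carries a human-readable meaning."""
    return float(user_weights @ item_embedding)


# Each user gets their own weight vector...
user_weights = rng.normal(size=EMBED_DIM)
item = rng.normal(size=EMBED_DIM)
before = score(user_weights, item)

# ...which a retraining pass nudges continuously, so yesterday's
# published snapshot already mis-describes today's ranking.
gradient_step = rng.normal(scale=0.1, size=EMBED_DIM)
after = score(user_weights + gradient_step, item)

print(before != after)  # the "algorithm" changed with no code change
```

The point of the sketch: even with the weights in hand, nothing in them says *why* the item scored highly, which is the interpretability gap the comment above describes.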


  • I don’t remember reading about ads in phone calls, nor about the complete mapping of customers’ behaviors for use in contexts other than the phone call itself.

    The apples-to-oranges stretch in this comparison is probably top five on HN ever.

    • > nor the complete mapping of customers behaviors to use in contexts not being the phone call

      This is because the telephone system was regulated with wiretapping laws, among others.

      > I don’t remember reading about ads in phone calls

      See above, but also: Junk Faxes & Telemarketing/Robocalls.

      > The apples to oranges in this comparison is probably top five on HN ever.

      It all comes down to whether you view social media as a communications platform or a publishing platform.

      The strict regulations governing the telephone system (see above) would adequately cover both, with deliciously antique turns of phrase like "wiretap" and "pen register".

  • Whatever was required of The New York Times and nothing more.

    If the NYT publishes an advert or editorial, it's held accountable for the contents.

    • Touché!

      The question is: Are social media services more similar to communications platforms or publishing platforms?

      My reply obviously treats them like the former and yours like the latter.

> But what responsibilities do megacorps have?

Fake and scam ads.

They literally profit from those ads. When an ad distributes malware or runs a scam, they take no responsibility.

> But what responsibilities do megacorps have?

They should have a responsibility of transparency, accountability, and empathy toward users. They should work for the user and in the user's interest. But multiple constraints make this impossible in practice.