Comment by solid_fuel
2 days ago
> The problem is that having an incentive to hide harms is being used as evidence for the harm, whether it exists or not.
No, the incentive to hide harm is being given as a reason that studies into harm would be suppressed, not as evidence of harm in and of itself. This is a direct response to your original remark that "Part of me thinks that if the case against social media was stronger, it would not be being litigated on substack."
Potential mechanisms and dynamics that cause harm are in the rest of my comment.
> Harming your users seems counterproductive at least to some extent.
Short term gains always take precedence. Cigarette companies knew about the harm of cigarettes and hid it for literally decades. [0] Fossil fuel companies have known about the danger of climate change for 100 years and hid it. [1]
If you dig through history, there are hundreds of examples of companies knowingly harming their users and continuing to do so until they were forced to stop or went out of business. Look at the Sacklers and the opioid epidemic [2]; hell, look at Radithor [3]. It is profitable to harm your users, as long as you get their money before they die.
[0] https://academic.oup.com/ntr/article-abstract/14/1/79/104820...
[1] https://news.harvard.edu/gazette/story/2021/09/oil-companies...
[2] https://en.wikipedia.org/wiki/Sackler_family
[3] https://en.wikipedia.org/wiki/Radithor
>No, the incentive to hide harm is being given as a reason that studies into harm would be suppressed, not as evidence of harm in and of itself. This is a direct response to your original remark that "Part of me thinks that if the case against social media was stronger, it would not be being litigated on substack."
That seems like a fair argument, but I don't think it grants opinions the weight of truth. I think it would make it fair to identify and criticise suppression of research, and to advocate for a mechanism by which such research can be conducted. An approach I would support in this area would be a tax or levy on companies with large numbers of users, earmarked to fund independent research into the welfare of their user base and their effects on society as a whole.
>Short term gains always take precedence.
That seems a far worthier problem to address.
>If you dig through history there are hundreds of examples of companies knowingly harming their users
I don't deny that these things happen; I simply believe they are not inevitable.
> That seems a far worthier problem to address.
If we can't fix the underlying problem immediately, treating the symptoms seems reasonable in the meantime.