Comment by jgeada

1 year ago

It is the incentives.

Maybe you're moral and keep to the straight and narrow. However, the system hires all types and the ones that just follow the incentives will do better. They get promoted, have more power, hire more people like them. Eventually the moral types will just be the exception and no longer affect the average.

Incentives exist because they change the behavior of the whole; they work as intended. It's just that what is intended isn't always desirable, or even a good idea.

No, it's people being unwilling to accept agency and own up to their faults. If we're evaluating a system of rules and how to improve them, trying to understand perverse incentives is important. But it is not an excuse for any individual who violates moral standards to do so because they had an incentive.

  • The problem is that much perverse behavior is not inherently immoral. Not all forms of p-hacking are unjustifiable, nor are attempts to increase citations through publication fragmentation, nor is appealing to grant providers with research that you consider duplicative. But all of these things make science worse.

People try to maximize the good and minimize the bad consequences of their actions. They might not do it with utils or any actual quantification, but they are doing it. And definitionally, there's no way to get rid of an incentive to defect, because getting rid of it creates a new incentive to defect in a different way.

For the purposes of this article, "incentive" could be shorthand for "any reason you could come up with to do something wrong to get ahead," but it could also be defined more broadly as "the expected good results of a choice." As long as money matters in society, there will always be an incentive to rob a bank. That can't be removed. What we can do is make banks harder to rob, and impose jail time and reputational damage on thieves. Creating a society where money doesn't matter might be possible, but then there'd be no bank.

By the same token, there will always be an incentive to fake data. We can make faking data harder and impose reputational damage on people who do it, but the incentive will still exist. The only way to eliminate it would be to make the outcomes of research not matter at all, and it's hard to imagine a functioning society doing any research if no outcomes mattered; high school dropouts would be applying for grants to build baking soda and vinegar volcanoes. The only way to prevent that would be a system where people have to justify their research without caring about the results, but then you've reintroduced "incentives", just different ones that can still be gamed.

Arguing that the problem is people's moral character rather than the mere incentives reintroduces one key disincentive: the reputational damage I alluded to earlier. Most people don't rob banks not because there's no incentive, but because the disincentive (jail, reputational damage) is high enough to make that course of action seem stupid. If instead you argue that incentives, not moral character, are to blame, you remove the disincentive of making defectors suffer reputational damage. You can't remove an incentive entirely. You can only change incentives and add disincentives. Reputational harm is one of those disincentives, and so are requirements like pre-registering experiments, open access journals, etc.

I agree with you. It's the law of large numbers; individuals make free choices, yet in the aggregate the incentives make the likelihood of decisions fairly predictable. This is the essence of "nudging" and choice architecture (and dark patterns, etc.).

I suggest reading The Selfish Gene. A system where only the selfish can thrive is shown to collapse with certainty, no matter how many times one runs the experiment.

  • This idea that we can just set up a good system with good incentives falls flat on its face before we even need to consider its merits, because there are always snakes involved in the development of these systems. The most important thing for societal survival is moral character, AND THEN the system itself.

  • So your argument is that capitalism is not about selfishness but mutually beneficial trades, since otherwise it would have collapsed by now? If you mean it's just a matter of time, every system collapses sooner or later, so that doesn't say much.

    • I'm not sure how capitalism crept into the discussion, since neither the article nor my comment and its parents touch on it.

      But since you have: absolutely, capitalism only works well when there are guardrails in place. Our greatest success is creating a society based on adherence to law and order that preserves individual freedoms. If society relied solely on capitalism, it would become possible to pay someone to kill someone else. Maybe we'll get there eventually once enough safeguards are eroded, who knows?

We can't just say it's the incentives. Morality is always a choice. What people who make amoral or immoral choices skate by on is that their morality will not be exposed or questioned, publicly or privately. We afford this to everyone because questioning someone's morality is considered impolite, extremely rude: you don't know what they've been through or what their life circumstances are. That is what puts us in a scenario where no one's morality can be called out even when there is obviously a moral dilemma. It's not an easy answer, but the only solution is courage. You will not make friends doing this, so really only people who have "made it" or just straight-up DGAF will take up this sword.

I remember in college just about every student was sharing exams and cheating. I just didn't do it, and I got shittier grades. Life is full of people like this and, no pun intended, it's demoralizing. I wonder if it's polite to say that to others, just a "hey, you are literally demoralizing me, it's toxic".

Shame, or its avoidance, is an incentive.

When researchers become shameless cheats, science suffers.

The depressing conclusion is that we need to redesign the incentives to work in a world without shame. This may help to some extent, but the result will certainly be worse than a world in which most researchers try to do the right thing.

It's funny: the whole presenting oneself as an objective beacon of morality is itself a response to an incentive system. For whatever reason (parents, school), the author was convinced that this type of morality would yield a better life, or peace of mind, or some optimal outcome.

  • > convinced that this type of morality would yield a better life, or peace of mind, or some optimal outcome

    Yeah. For example, a society where people work together for the benefit of all, instead of having some people exploit the others.

    • It's interesting that this concept of ethics evolved at all.

      "Nature is red in tooth and claw." It's a brutal, heartless competition for resources and survival.

      And yet through the vicious process of evolution, we developed empathy and a sense of fairness. Those instincts must have some cold, rational benefit for the survival of the group.


And even more so in the short term.

If the scientist in question doesn't get found out for 30 years but then becomes a pariah, it doesn't matter. They displaced a better scientist for their entire career. There is no retroactively fixing that.

There is a huge incentive to cough up something that will make you "famous". If it doesn't make you famous, well, you can bury it and simply be a pedestrian scientist--no harm, no foul. If it does make you "famous", well, you might make it out the other side without anybody being able to pin anything decisive on you. And, if you get caught and become a pariah in 10 years, well, you likely earned way more than you would have in 30 years anyway.

Lying, in this case, almost always comes out ahead.

In systems that quickly filter out most people (like academia), the incentive structure is even more selective.

So maybe it's how limited the resources are (limited grants, limited tenure positions), more so than the incentives themselves.

We can either accept that there are incentives, but ultimately people are responsible for their [in]actions and should be lauded or punished accordingly, or what? We are but powerless automatons thrust into this System that requires us to... falsify scientific data in order to get tenure?

No, sorry. We have free will. We have agency. Using an example from the article, there are approximately 0 professors who eschew open access journals because of their low impact factor until they get tenure, and then publish exclusively in open access journals once they have it. There's always some other faculty position, some other grant, some citation, some conference. They just don't want to publish in open access journals, because closed journals are "better." That's fine, but don't be so dishonest as to pretend it's the incentives that pushed you there. It's you.