
Comment by pixelfarmer

3 days ago

There is no system that fulfills your requirements.

It is even easy to explain why: humans are part of all the moving pieces in such a system, and they will always subvert it to their own agenda, no matter what rules you put in place. The more complex your rule set, the easier it is to break.

Look at games: a card game, a board game, some computer game. There is a fixed set of rules, and still humans try to cheat. We are not even talking about adults here; you see this with kids already. With games, either other players call the cheating out, or a computer prevents it (maybe). Now imagine everyone could call someone else a cheater and stop them from doing something. That power itself would be misused. Humans will subvert systems.

So the only system that would work is one with a non-human, incorruptible game master, so to speak. Not going to happen.

With that out of the way, we certainly can ask the question: What is the next best thing to that? I have no answer to that, though.

Cheating happens in competition-based systems. No one cheats in games where the point is to co-operate toward a common goal. We should aim for a system based on recognizing those common goals and enabling large-scale co-operation to achieve them.

  • > co-operate to achieve some common goal.

    All systems involving humans are competitive. Even in a constrained environment like academia, where research is cooperative, the competition for recognition is still strong, down to the order of authorship on a paper.

    What you're asking for, cooperation to achieve common goals, is altruism. That does not exist in human nature.

    • Academia is competitive because it's designed to be competitive. If things like funding, recognition, and opportunities go to "winners", people will try to win. It's possible to design systems that do not force people to compete. For example, you could strip the names from papers and assign funding randomly or semi-randomly, and the competition would end. Then add some form of retroactive funding (or other kinds of rewards) awarded to research that has produced useful results, and you get the incentive to do good research without the need for competition.

      Designing systems that avoid competitive behavior is harder, but I don't think it's impossible. And of course competition is not all bad; it's a good tool when used carefully. But it's far too much when most of our systems are based on it.


> What is the next best thing to that? I have no answer to that, though.

I argue that what we have today is the so-called next best thing: free-market capitalism, with a good dose of democracy and strong (but not overbearing) government regulation.