Comment by madaxe_again
10 years ago
Complexity and complacency kill. As you say, everything moves in cycles, and while it's easy to say that we "make the same mistakes", the reality is closer to "the same inevitable forces take effect".
Our societies are driven by emotion, and by reactionary forces. Revolution leads to expansion, which leads to consolidation, which leads to narcissism, which leads to complacency, which leads to inner turmoil, which leads to revolution.
I've recently re-read Asimov's Foundation series end-to-end, and his ideas of how an empire rises and falls, while quite clearly based on Rome, hold very true, particularly in today's world.
All we have now will ultimately end in ashes, and assuming we don't end up with an uninhabitable planet, something new, and marginally better, will grow from those ashes - but it will take time, and the setbacks will be enormous.
As long as we have humans making decisions, the same ideas will take root, the same feelings will be in effect, the same dichotomies of "us" and "them" will arise, and the instinct to hoard for winter will never subside.
The only way out is probably strict technocracy run by benevolent AIs - but it's not one that anyone is going to be willing to accept, as by doing so one loses much of what it is to be human.
Ultimately, you have to accept the bad with the good, enjoy what little life you have, and make your goal maximising the happiness of yourself and others - as there's quite literally nothing else to do.
> Complexity and complacency kill.
Clearly it's not as simple as "complexity is bad." Ask any engineer: complexity often enables performance and efficiency. A modern V6 is far more complex than a 1960s car engine, but smaller, lighter, more powerful, and more fuel efficient. Complexity is just a fact of life that needs to be managed.
Indeed, the story of our civilization is progress driven by ever-increasing complexity, facilitated by technological innovation. Amazon is far more complex than a local mom-and-pop. It's also far more effective at getting products to people more quickly and more cheaply. Going back in history, Great Britain's financial system was vastly more complex than what contemporary countries had, but it enabled the tiny nation to build and control a vast economic empire.
Essential complexity is not bad; accidental complexity is.
The complexity of something that overlays or models a system should reflect the complexity of that thing.
Complication kills, not complexity. Complex systems are made up of simple, independent pieces. Those pieces work together naturally, often giving rise to new behaviors. Because they are independent, they can also adapt easily to changes in the environment.
Complication is different. Complicated systems have dependencies that are tightly bound together. Because of this, they can't adapt as easily to environmental changes. Consequently, this creates conflict between system and environment that can often be destructive or violent.
Or at least that's my social hypothesis. I based my idea on Rich Hickey's "Simple Made Easy" talk on software complexity. I think his ideas could be generalized to societies as well.
https://www.youtube.com/watch?v=rI8tNMsozo0
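To make the distinction concrete in software terms, here's a loose sketch in Python (names and steps are my own illustration, not from Hickey's talk): a "complex" system built from simple, independent pieces composed in a pipeline, where adapting to a new environment means swapping one piece rather than rewriting the whole.

```python
# "Complex but simple": each step is independent and knows nothing about
# the others, so any step can be replaced without touching the rest.

def clean(text):
    return text.strip().lower()

def tokenize(text):
    return text.split()

def count(tokens):
    counts = {}
    for t in tokens:
        counts[t] = counts.get(t, 0) + 1
    return counts

def pipeline(value, steps):
    # Compose independent pieces; the composition, not the pieces,
    # carries the overall complexity.
    for step in steps:
        value = step(value)
    return value

word_counts = pipeline("  The cat saw the cat  ", [clean, tokenize, count])

# Adapting to a changed "environment" (comma-separated input) only means
# swapping the tokenize step; clean and count are untouched.
csv_counts = pipeline("cat,dog,cat", [clean, lambda s: s.split(","), count])
```

A "complicated" version would braid the cleaning, splitting, and counting logic into one function; the same change of input format would then force edits throughout the tangled whole rather than a one-piece swap.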
>The only way out is probably strict technocracy run by benevolent AIs - but it's not one that anyone is going to be willing to accept, as by doing so one loses much of what it is to be human.
This is along the lines of Plato's philosopher kings, who he insisted were the only hope. This is fascist crap. Benevolent philosopher kings don't exist, and neither do benevolent AIs.
There was, in fact, rule by philosopher kings in Plato's generation, by a gang of thirty who learned from Socrates. Unsurprisingly, they were brutal fascists. An AI regime would be the same.
This is because there is only one way to ensure human needs are met, and it isn't technocratic anything. It is to ask people and empower them to get what they need directly. I.e. democracy, in as direct and total a form as we can manage efficiently.
Only one of the Thirty was a student of Socrates, not all thirty, and none of them were philosopher kings. They were a puppet government installed by the Spartans, which is why Socrates was among the people who opposed them. The fact that Critias was among the Thirty was used by Socrates' enemies to slander him, but he had many pupils among the youth of Athens, all of whom had different reasons for listening to his lectures.
The slander was justified, in my opinion. Critias was not merely 'among' the Thirty, he led them. And Socrates was not exactly a democrat anyway.
In any case the point is: no hyperintelligent monarch is going to appear to introduce a reign of reason that will save us all.
> This is because there is only one way to ensure human needs are met, and it isn't technocratic anything. It is to ask people and empower them to get what they need directly. I.e. democracy, in as direct and total a form as we can manage efficiently.
Which would work, except that what one person needs or wants does not align with what another needs or wants. If you then propose to 'weigh' the needs and wants against each other, I'll point out that there is no way to do such a thing, since all human judges of what is ethical are biased by everything they are.
Supposing an AI regime could be established, it would be so powerful as to completely annihilate proponents of values it considers inferior, thereby creating a 'perfect' society from a certain point of view...
>If you then propose to 'weigh' the needs and wants against each other, I'll point out that there is no way to do such a thing.
Sure, we have many tools for achieving this. Compromise, conversation, empathy. We've been using these tools to solve our differences for as long as there have been humans.
> It is to ask people and empower them to get what they need directly. I.e. democracy
You've simply replaced one unproven ideal with another here.
How is democracy an 'unproven ideal'? It has thousands of years of history across many human cultures.