Comment by BurningFrog
5 hours ago
The West didn't get rich from colonizing the world.
It got rich domestically through industrialization. The newly rich countries then went on to colonize the world, because now they could. Whether and how much the colonies made them even richer is debatable, but on average it was probably a net cost.
This is one of several insights counter to "common sense" that economists have figured out.
> Whether and how much the colonies made them even richer is debatable, but on average it was probably a net cost. This is one of several insights counter to "common sense" that economists have figured out.
I haven't heard this before. Do you have sources where I could learn more?
Of course he doesn't, because he just made it up.
There are scholars who reach quite different conclusions from yours.
https://www.jasonhickel.org/research
Let me just add that the colonization of the Americas in the 1500s was of course unrelated to industrialism, which emerged centuries later. Much of that was an accident of immunology.
Note that industrialized countries without colonial empires (Switzerland, for example) ended up at least as rich as the big European colonial powers.