Comment by gyomu
18 hours ago
It's really hard to extract computing from the capitalistic, consumerist cradle within which it was born.
Every other human creative practice and medium (poetry, theater, writing, music, painting, etc.) has existed in a wide variety of cultures, societies, and economic contexts.
But computing has never existed outside of the immensely expensive and complex factories and supply chains required to produce computing components, nor outside of corporations producing software and selling it to other corporations, or to the large consumer class with disposable income that industrialization created.
In that sense, the momentum of computing has always favored the corporations manufacturing the computers dictating what can be done with them. We've been lucky to have had a few blips like the free software movement here and there (and the outsized effect they've had on the industry speaks to how much value there is to be found there), but the hard reality is that if you control the chip factories, you control what can be done with the chips - Apple being the strongest example of this.
We're in dire need of movements pushing back against that. To name one, I'm a big fan of the uxn approach, which is to write software for a lightweight virtual machine that can run on the cheap, abundant, less locked-down chips of yesteryear - chips that will probably still be available and understandable a century from now.
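To make the "lightweight virtual machine" idea concrete, here is a toy sketch of a stack-based interpreter in Python. The opcodes (`push`, `add`, `mul`) are invented for this example and are not uxn's real instruction set (uxn has its own byte-coded ops programmed in Uxntal assembly); the point is just that the whole execution model fits in a few dozen lines, which is what makes such a machine portable and understandable long-term.

```python
def run(program):
    """Interpret a list of (op, arg) tuples on a simple data stack.

    This is an illustrative toy, not uxn: the instruction names and
    encoding here are made up for the example.
    """
    stack = []
    for op, arg in program:
        if op == "push":
            stack.append(arg)          # push a literal value
        elif op == "add":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)        # pop two, push their sum
        elif op == "mul":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)        # pop two, push their product
        else:
            raise ValueError(f"unknown op: {op}")
    return stack

# Compute (2 + 3) * 4 on the toy machine.
result = run([("push", 2), ("push", 3), ("add", None),
              ("push", 4), ("mul", None)])
print(result[-1])  # 20
```

Because the "hardware" is just this loop, any platform that can run the loop can run every program ever written for it - that is the longevity argument in miniature.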
You can only blame capitalism so much for the unpopularity of HyperCard-like things vs. Instagram/Facebook/Twitter, etc.
On some level it is just human nature to want to consume rather than create. It just is. It's not great, but let's not act like people haven't tried to make creative new platforms for self-expression and software creation - they all kind of failed.
> it is just human nature to want to consume rather than create
That may be true.
But it doesn't really explain why the tools for simple, popular creation aren't there. There are a lot of people in the world who would use them, even if it's only 1%.
Part of the problem with trying to isolate computing is that it's fundamentally material. Even cloud resources are a flimsy abstraction over a more complex business model. That materiality is part of the issue, too: you can't ever escape the churn; bit rot gets your drives, and Hetzner doesn't sell a lifetime plan. If you're not computing for the short term, you're arguably wasting your time.
I'm not against the idea of a disaster-proof runtime, but you're not "pushing back" against the consumerist machine by outlasting it. When high-quality software becomes inaccessible in order to support some sort of longtermist runtime, low-quality software everywhere rises in popularity.