Comment by vermilingua

6 days ago

Hmm, where have I seen this before…

https://en.wikipedia.org/wiki/Accelerando

One of the most interesting things about the book is how it skewers the idea of there being a singular AI.

In Accelerando the VO are a species of trillions of AI beings that are sort of descended from us. They have a civilization of their own.

  • That description leaves off some of the flavor: it changes the feel once you know the descendants were labeled the Vile Offspring by the main characters, who were still human-ish.

For better or worse, Accelerando comes to mind _a lot_ when trying to figure out how agent platforms should work. Hopefully for better.

It's not gonna happen like that, because all the paths leading to it cannot be financially exploited.

Also what a shortsighted scifi book, yet techies readily invest in that particular fantasy because it's not your usual spaceship fare.

  • > Also what a shortsighted scifi book

    It's art not oracle

    • Some people will call BS on books like that until every detail comes true, right down to getting their own Aineko cat, and then start again when the Wunch start eating their uploaded thought space via man-in-the-middle exploits, complaining that the protocol spec in the book was inaccurate

Here's another one: Manna - Two Views of Humanity’s Future, by Marshall Brain. It's a fairly light read, just 8 chapters:

https://marshallbrain.com/manna1

I think I've internalized these stories enough to comfortably say (without giving anything away) that AI is incompatible with capitalism and probably money itself. That's why I consider it the last problem in computer science: once we've solved problem solving, the (artificial) scarcity of modern capitalism and the social Darwinism it relies upon can simply be opted out of. Unless we collectively decide to subjugate ourselves under a Star Wars empire or Star Trek Borg dystopia.

The catch being that I have yet to see a billionaire speak out against the dangers of performative economics once machines surpass human productivity or take any meaningful action to implement UBI before it's too late. So on the current timeline, subjugation under an Iron Heel in the style of Jack London feels inevitable.

  • > take any meaningful action to implement UBI

    I hear this all the time, but to what end? If the input costs to produce most things end up driving towards zero, then why would there be a need for UBI? Wouldn't UBI _be_ the performative economics mentioned?

    • I think of it like limits in math. The rate at which we'll be out of work is much higher than the rate at which prices will fall towards zero.

      A performative/underemployment economy keeps everyone working not out of necessity, but to appease the sentiments of the wealthy. I'd argue that we passed the point at which wages were tied to productivity sometime around 1970, meaning that we're already decades into a second Gilded Age where wealth comes from inheritance, investment and connections (forms of luck) rather than hard work.

      And honestly, to call UBI performative when billionaires are trying to become trillionaires as countless people die of starvation every day just doesn't make any sense.

Isn’t that the one where corporate structures become intelligent, self-executing agents and cause a lot of problems? Yet here IRL, the current tech billionaires think it’s a roadmap to follow?

Talk about getting the wrong message. No one show those guys a copy of 1984! Wow, then…