Comment by Leomuck

12 hours ago

So basically more ways of trying to make people buy things, do things, think things than before? I feel like our whole world more and more circulates around manipulation and the absence of truth and discourse.

Then again, I do think LLMs are an incredible technological achievement. The issue is not so much what they do or that they exist, but how they are utilized. Right now, they are utilized to further the class divide between rich and poor.

Who are we to trust in the future? Not big companies, not the state, not LLMs. Time to organize around groups and collectives that we know we can trust and that we know have our wellbeing in mind.

> The issue is not so much what they do or that they exist, but how they are utilized

This is exactly how we got here though. Technology is not passive. It changes incentives, procedures, and ideas, and shapes the world. If we don't structurally limit what it does and how it's used, then we are not in control, no matter what our personal choices are.

  • A major problem is that if we structurally limit what technologies do, we are still not in control. Now whoever we empowered to control and limit the technology is in control. Who keeps them accountable?

    You’ll probably get one of three outcomes: regulatory capture by monopolies, self-dealing by bureaucrats to enrich themselves or gain power, or regulatory capture by self-absorbed ideologues who halt all progress or force it down some ideologically approved path.

    In none of those scenarios is anything aligned with the best interest of the people.

> I feel like our whole world more and more circulates around manipulation

Hate to break it to you but it's always been this way, and it was easier in the past when information was so much more expensive to distribute.

  • Cheap distribution makes manipulation easier, not harder. The time it takes for a bad actor to capture attention is much smaller now. It used to be you'd read one news article about a story once a day in a newspaper and maybe once more in the evening news. In between you could think about it, talk about it with other people you know in real life, etc.

    Now you're getting meme after meme of the same story multiple times throughout the day, twisted in so many ways. And since we all have our feeds/algorithms adjusted to suit our own tastes, we're all getting our own siloed view of things and can barely rely on a shared set of facts.

    All that makes manipulation of people much easier.

    • And the memes are all just one sentence gotchas with no real substance. And that's how people ingest the news. Headlines and Memes. Who needs actual articles?

  • > it was easier in the past when information was so much more expensive to distribute.

    As a 52-year-old, my life experience disagrees.

    It is much easier now because information flows both ways and "They" have a lot of information on you (and everyone else) and can use that information to manipulate you with algorithmic ragebait, and to extract maximum rents (in all aspects of commerce, not just literal rents) from you, etc.

    Not that things were ever perfect in the past, they certainly weren't, but increasingly so much of everything is literally just an outright fucking scam these days and all of it is being turbocharged by various forms of "AI" adjacent technology and increasing deregulation.

> Time to organize around groups and collectives that we know we can trust

I’ve had the same thoughts, but if you look deeper, it all circles back to what we already had: (open, transparent) public institutions, society, and government by the people. The foundation wasn't the problem; the environment was.

Along the way, social media noise, engagement optimisation, and Kardashian-style "entertainment news" infecting real news created an attention economy where, no matter how scandalous you are, attention can be minted into dollars. That is what polluted our infosphere and led to the lack of trust.

Now, nobody trusts these previously mentioned public entities any more - sometimes due to state-actor or ad-tech disinformation, and sometimes for good reason, as when the poisoned public allowed these '80s telemarketer-style political weirdos and their cronies to take over public administration.

Our society, pre-internet, built systems to manage trust. The conditions that allowed those systems to exist (the speed of data transmission, the ratio of content generation to verification, the ability to shape consensus) have changed.

You are ringing the clarion call for community and cooperation, and it will not work. Not because people don’t want community or the better things, but because incentives make the world go round.

The choice between making some money at the cost of polluting the information commons is no choice at all. That degradation of the commons means no one can escape. No community you form, no group you build, dodges the fallout when someone decides to set fire to shared infrastructure.

We are moving into the dark forest era of the information economy. As models improve, inference costs drop, and capacity increases, the primary organism creating content online will be the bot.

Instead of building communities of people, build collectives based on rules of engagement. Participants - be they bots or humans - must follow prescribed rules of conflict and debate.

That way it doesn’t matter if you are talking to a machine or a person. All that matters is that the rules were followed.

Local models, powerful consumer hardware, and an informed populace that doesn't hate STEM would be the way out, but that's not good for shareholder value, so you get expensive everything everywhere all at once instead. And if you dare question the mindset of hating on STEM whilst being addicted to its fruits, that just means you're another one of those maximally SV-aligned sociopaths, so why bother? Evolve and let the chips fall where they may, because I don't see any other options that play out in an idiocracy craving strong, confidently wrong leadership.

> Right now, they are utilized to further the class divide between rich and poor.

Ironically, this was the main reason LLMs were introduced in the first place: not to benefit the poor, but to widen the gap between the rich and the poor.

The majority of human history has been written by the ruling class of the day. Transparency only seems to follow in the wake of their inevitable fall, usually at great cost in retrospective research, via the oft-thankless unraveling of threads of truth from their more copious fictions. Much like the machines we construct in our likeness, we too seem to get stuck in endless regressive cycles.

Folks in the "now" have always had a tendency to cling to their fictions as if they were truth, for whatever reason: nationalist exceptionalism, racial superiority, religions rooted in "othering", etc. Humans seem to have an innate desire to fool themselves and trust in things they should not. Perhaps it's simply a sort of existential coping mechanism for living in a cold, unforgiving reality. We seek the comfort of lies.

Organizing around groups of trust tends to lead to factionalism and conflicts. Knowing and trusting are sadly very different things in our species.

The Old Internet was a whalefall - information online was fairly trustworthy while being more convenient and more plentiful than in-person information.

The whale's been eaten now. The broader Internet is mostly not trustworthy or convenient, and the information is not even very plentiful.

People are retreating, and will keep retreating, into high-trust zones: in-person networks, product recommendations from real friends, and closed group chats.

It's not the end of the world, but things have changed. We'll have to put more work into finding information than we're used to.

Self-inflating nipple-shaped balloons that generate their own lift without any helium would be an incredible achievement, but that doesn't mean they're useful beyond being novel. Chatbots are ultimately just predictive text on steroids, and only complete fools would base their business, or an entire economy, around it.