
Comment by Fiveplus

2 days ago

Reading this felt like the official obituary for the 90s techno-optimism many of us grew up on.

The "end of history" hangover is real. We went about building the modern stack assuming bad actors were outliers, not state-sponsored standard procedure. But trying to legislate good use into licenses? I don't know how you could realistically implement that, or to what extent. And that kind of solution implies we have to move toward zero-trust architectures even within open communities.

As an example: formal proofs and compartmentalization are unsexy but they're a solid way we survive the next decade of adversarial noise.

I remember reading a quote somewhere that stuck with me. Paraphrasing, "If the architecture of my code doesn't enforce privacy and resistance to censorship by default, we have to assume it will be weaponized".

I'm out of practical ideas; lots of things sound good on paper and in theory. It's a bit sad tbh. Always curious to hear more on this issue from smarter people.

"If the architecture of my code doesn't enforce privacy"

This is still techno-optimism. The architecture of your code will not do that. We are long past the limits of what you can fix with code.

The only action that matters is political and I don't think voting cuts it.

  • Yeah, reminds me of the "Security" xkcd (https://xkcd.com/538/) - a threat from a good ol' 5-dollar wrench defeating state-of-the-art encryption.

    Never underestimate how state actors can use violence (or merely the threat of it) to force people to do things. The only way to respond to that is not through code or algorithms or protocols, but through political action (whether violent or non-violent).

  • > We are long past the limits of what you can fix with code.

    An example of what is not possible to fix with code?

    • Hardware? The real world? Pretty much everything?

      Power. Real power. The power to kill you, take your property, harm your family, tell lies about you on the news, etc.

      I've always been surprised by the naivety of tech people with respect to this question. The only possible solution to power is power itself. Software can be a small part of that, but the main part of it is human organization: credible power to be used against other organized holders of power. No amount of technology will let you go it alone safely. At best, you may hope to hide away from power with the expectation that its abuse will just skip over you. That is the best you could hope for if all you want are software solutions.


> trying to legislate good use into licenses

It's also questionable to what extent restrictive licenses for open source software stay relevant in the first place: you can now fairly easily run an AI code generator that imitates the logic of a FOSS project but emits newly generated code, so you never have to adhere to the license's restrictions at all.

Perhaps we need reputation on the network layer? Without it being tied to a particular identity.

It would have to be hard to farm (perhaps entropy detection on user behaviour, plus clique detection).
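As a toy illustration of the clique-detection half of that idea (all names, thresholds, and data shapes here are hypothetical, and real Sybil/farm detection is far harder than this), one could flag groups of accounts that interact almost exclusively with each other:

```python
from itertools import combinations

def mutual_interaction_graph(events):
    """Build an undirected graph from (actor, target) interaction events."""
    edges = {}
    for a, b in events:
        edges.setdefault(a, set()).add(b)
        edges.setdefault(b, set()).add(a)
    return edges

def suspicious_cliques(edges, min_size=3, insularity=0.9):
    """Flag account groups whose interactions stay almost entirely in-group.

    insularity: minimum fraction of each member's neighbours that must lie
    inside the group for it to be flagged. Brute force over all subsets,
    so this is exponential -- illustration only, not a production algorithm.
    """
    flagged = []
    nodes = sorted(edges)
    for size in range(min_size, len(nodes) + 1):
        for group in combinations(nodes, size):
            gset = set(group)
            # every pair in the group must interact (i.e. it's a clique) ...
            if not all(b in edges[a] for a, b in combinations(group, 2)):
                continue
            # ... and members must interact mostly within the group
            if all(len(edges[m] & gset) / len(edges[m]) >= insularity
                   for m in group):
                flagged.append(gset)
    return flagged
```

The `insularity` threshold is exactly where the false-positive question bites: set it too low and small genuine friend groups get flagged; set it too high and a farm only needs a few token outside interactions to evade detection.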

  • How does one make sure the implementation is sufficient and complete? It feels like assuming total knowledge of the world, which is never true. How many false positives and false negatives do we tolerate? How does it impact a person?

    • I'm not sure. We can use LLMs to try out different settings/algorithms and see what it is like to have it on a social level before we implement it for real.


> If the architecture of my code doesn't enforce privacy and resistance to censorship by default

which is impossible.

- No code is feasibly guaranteed to be secure

- All code can be weaponized, though not all feasibly; password vaults, privacy infrastructure, etc. tend to show holes.

- It’s unrealistic to assume you can control any information; case in point, the garden-of-Eden test: “all the data is here; I’m all-powerful, and you should not take it”.

I’m not against regulation and protective measures. But you have to prioritize carefully. Do you want to spend most of the world’s resources mining cryptocurrency and breaking quantum cryptography, or do you want to develop games and great software that solves hunger and homelessness?

  • No code architecture will enforce privacy or guarantee security.

    Some code architectures make privacy and security structurally impossible from the beginning.

    As technologists, we should hold ourselves responsible for ensuring the game isn't automatically lost before the software decisions even leave our hands.

"As an example: formal proofs and compartmentalization are unsexy but they're a solid way we survive the next decade of adversarial noise."

I think you are on to something there. Noise is really signal divorced from tone. Our current consensus protocols are signal based. They demonstrate control, but not rightful ownership. Pairing a tone keypair with a matching signal keypair in a multisig configuration would be compatible with current networks, but also allow a bottom-up federated trust network to potentially emerge?

There are no technology solutions to what are fundamentally limits of human society.

We reached the limits of societal coherence and there’s no way to bridge the gap

Things like that should not be handled at the software level; you will always lose and run out of resources. You basically have to force politicians to act (fat chance).

  • Politicians aren't generally leaders, but rather followers. To force politicians to do something, lead where people follow you. But of course, paradoxically, this will by definition make you a practitioner of politics yourself... To quote from The Hunt for Red October, "Listen, I'm a politician, which means I'm a cheat and liar. When I'm not kissin' babies I'm stealin' their lollipops. But! It also means I keep my options open."

> That solution implies we have to move toward zero-trust architectures even within open communities

Zero trust cannot exist as long as you interact with the real world. The problem wasn't trust per se, but blind trust.

The answer isn't to eschew trust (because you can't) but to organize it with social structures, like the “chain of trust” certificate model people used before it was commoditized by commercial providers and cloud giants.
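That chain-of-trust idea can be sketched roughly as follows (the names are hypothetical, and real systems like PGP's web of trust also weight endorsements and distinguish trust levels): trust is a directed graph of signed endorsements, and you accept a key only if a short chain of endorsements leads from you to it.

```python
from collections import deque

def is_trusted(endorsements, me, target, max_hops=3):
    """Breadth-first search over signed endorsements.

    endorsements: dict mapping a key to the set of keys it has vouched for.
    Returns True if `target` is reachable from `me` within `max_hops` steps.
    """
    frontier = deque([(me, 0)])
    seen = {me}
    while frontier:
        key, hops = frontier.popleft()
        if key == target:
            return True
        if hops == max_hops:
            continue  # don't extend chains beyond the hop limit
        for vouched in endorsements.get(key, ()):
            if vouched not in seen:
                seen.add(vouched)
                frontier.append((vouched, hops + 1))
    return False
```

The `max_hops` cap is the structural answer to blind trust: endorsements stay transitive, but only up to a distance you explicitly chose, rather than "anyone a central authority has ever signed".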

> trying to legislate good use into licenses

Text files don't have power. Appealing to old power institutions to give them power is not the way to create new power either. Legacy systems with entrenched power have tended to insulate those at the top, killing social mobility and enabling those institutions to act against broad interests.

Open source has always been a force of social mobility. You could learn from reading high quality code. Anyone could provide service for a program. You could start a company not bound by bad decision makers who held the keys.

Open source always outmaneuvers inefficiency. Those who need to organize are not beholden to legacy systems. We need technically enabled solutions to organize and create effective decision-making. The designs must preserve internal social mobility to avoid becoming what they seek to replace. I'm building technically enabled solutions for this at https://positron.solutions

The Internet was the “Wild West”, and I mean that in the most kind, brutal, and honest way: part free fantasy (everyone has a website), part genocide (the replacement of the real world), and part emerging dystopia (thieves and robbers, large companies, organizations, and governments doing terrible things).

It’s changing but not completely.

  • Which, if you think about it, is a mostly uplifting timeline.

    Back in 1770 there were basically 0 democracies on the planet. In 1790 there were 2. Now there are about 70, with about 35 more somewhere in between democracy and autocracy. So most of the world's population is living under some form of democracy. I know that things are degrading for many big democracies, but it wouldn't be the first time (the period from WW1 until the end of WW2 was a bad time for democracies).

    I have no idea how we get from here to a civilized internet, though.

"The "end of history" hangover is real."

This is the real issue. FOSS was born out of a utopian era in the '60s-2000s, when the US was still a beacon of hope. That is fundamentally impossible in today's world of shark-eat-shark capitalism and a global race to the bottom.

If it didn't already exist, FOSS couldn't get off the ground today; even its continued survival is in jeopardy.

  • FOSS was born because the cost of sharing information rapidly approached nothing. BBS and Usenet were loaded with shared software, simply because it was easy to share and there was incredible demand for it.

    FOSS doesn't need the US or 1980s counterculture to succeed. It just needs cheap disk space and someone willing to share their code. The price of storage and internet continues to fall, and I think FOSS will be fine as long as that continues.

    • Sure, and there will continue to be hackers that love programming and want to share.

      But the article was largely about how to control access, licensing, and bad actors. And that is out the window: anybody can take your code regardless of the license, and North Korea can use it in missiles if they want; nothing can stop that if the code is openly shared.

I don't get why you conflate privacy and resistance to censorship.

I think privacy is essential for freedom.

I'm also fine with lots of censorship, on publicly accessible websites.

I don't want my children watching beheading videos, or being exposed to extremists like (as an example of many) Andrew Tate. And people like Andrew Tate are actively pushed by YouTube, TikTok, etc. I don't want my children to be exposed to what I personally feel are extremist Christians in America, who infest children's channels.

I think anyone advocating against censorship is incredibly naive about how impossible it's become for parents. Right now it's a binary choice:

1. No internet for your children

2. Risk potentially massive, life-altering harm, as parental controls are useless, half-hearted, or non-existent. Even companies like Sony and Apple make it almost impossible to have a choice in what your children can access. It's truly bewildering.

And I think you should have to identify yourself. You should be liable for what you post to the internet, and if a company has published your material but doesn't know who you are, THEY should be liable for the material published.

Safe harbor laws and anonymous accounts should never have been allowed to co-exist. It should have been one or the other. It's a preposterous situation we're in.

  • Voluntary “censorship” (not being shown visceral media you didn't ask for) and censorship for children are very important.

    Bad “censorship” is involuntarily denying or hiding from adults what they want to see. IMO, that power tends to get abused, so it should only be applied in specific, exceptional circumstances (and probably always temporarily, if only because information tends to leak, so there should be a longer fix that makes it unnecessary).

    I agree with you that children should be protected from beheading and extremism; also, you should be able to easily avoid that yourself. I disagree in that, IMO, anonymous accounts and “free” websites should exist and be accessible to adults. I believe that trusted locked-down websites should also exist, which require ID and block visceral media; and bypassing the ID requirement or filter (as a client) or not properly enforcing it (as a server operator) should be illegal. Granting children access to unlocked sites should also be illegal (like giving children alcohol, except parents are allowed to grant their own children access).

  • I thought it was easy: watch videos with your kid, don't allow them to doomscroll or be raised by the "featured"/"front page" algorithms.

    • You can't be with your child 100% of the time. They are spending significant time with others, e.g. in school. Those people you can't control.

      Doomscrolling or porn is just too "appealing" to children, like sugar. Children don't have their minds fully developed to be able to say "no" to them.

      If in school everybody has a smartphone and does doomscrolling, your children will do as well. Or they'll be ostracised.


  • [flagged]

    • This is a horrible comment and is exactly what we're trying to avoid on HN. The guidelines make it clear we're trying for something better here. HN is only a place where people want to participate because enough people take care to make their contributions far more substantive than this. Please do your part by reading the guidelines and making an effort to observe them in future.

      These ones in particular are relevant:

      Be kind. Don't be snarky. Converse curiously; don't cross-examine. Edit out swipes.

      Comments should get more thoughtful and substantive, not less, as a topic gets more divisive.

      When disagreeing, please reply to the argument instead of calling names. "That is idiotic; 1 + 1 is 2, not 3" can be shortened to "1 + 1 is 2, not 3."

      Please don't fulminate. Please don't sneer, including at the rest of the community.

      Please respond to the strongest plausible interpretation of what someone says, not a weaker one that's easier to criticize. Assume good faith.

      Eschew flamebait. Avoid generic tangents. Omit internet tropes.

      Please don't use Hacker News for political or ideological battle. It tramples curiosity.

      https://news.ycombinator.com/newsguidelines.html