
Comment by mattsoldo

6 days ago

It's never OK to physically attack someone like this. Full stop.

Separately, Sam's belief that "AI has to be democratized; power cannot be too concentrated" rings incredibly hollow. OpenAI has abandoned its open source roots. It is concentrating wealth - and thus power - into fewer hands, not more.

If only that sentiment were reciprocal!

When the job losses hit in earnest and the vague handwaving about making it right all inevitably turns out to be hollow, those on top will be exceedingly comfortable using violence to keep the underclass in line. It has happened before and it will happen again.

  • My assumption, based on many factors, is that this is precisely why blanket surveillance systems like Flock are being rolled out in preparation.

    There are people in control who don’t make 1, 5, or 10 year plans; they make 20, 50, 100, and 500 year plans; and they know human nature quite well, which allows them, if not to predict, then at least to anticipate what their plans will cause and what needs to be prepared for in advance.

    • The Flock systems are being installed by cities, not the feds. You make it seem like someone has some master plan. That doesn't make Flock any less dangerous, but it's not as organized as you make it seem.

      3 replies →

Sam eagerly pursued DoD contracts to weaponize AI. And then lobbied for legislation to ensure OpenAI cannot be held accountable if people are killed due to their systems.

  • I find it interesting that Altman's fans seem to keep skipping past this fact. I'd love to hear their defense as to why one person potentially being responsible for hundreds or thousands of deaths is acceptable, but attacking that one person isn't. If violence is never the answer, they should be condemning Altman with even more vigor.

    • > why one person potentially being responsible for hundreds or thousands of deaths is acceptable

      I am not sure who exactly that one person is. Is it Altman, who according to many people is not that knowledgeable in AI in the first place; the scientist who found a breakthrough (who is it?); the president of the United States who is greenlighting the strikes; the general who is choosing the target (based on AI suggestions); the missile designer; the manufacturer; the pilot who flew the plane?

      I get the point about concentrating power in fewer hands, but the whole "all the problems of this world are caused by an extremely narrow set of individuals" always irks me. Going as far as saying there is just one is even more ludicrous.

      5 replies →

    • The entire purpose of government is to have a monopoly on violence. Democracies give their government the power to decide when and against whom to deploy violence.

      There is a real difference between giving a democratic government the tools to kill people vs attempting to kill people yourself. If you don’t believe this then you don’t believe in democracy.

      22 replies →

  • There are thirty-some-odd million people in Ukraine who very much would like to get AI weapons before the Russians do. They're coming whether you want them or not.

The thing about the rich is that they have access to sufficient levels of abstraction that they can commit terrible, disproportionate violence without it looking that way. And then fools who crave the simplistic safe comfort of moral absolutes come to their aid.

Throwing a petrol bomb at a building with children inside is about as evil as murdering 150 students at an all-girls school. I'm obviously not defending that.

  • > Throwing a petrol bomb at a building with children inside is about as evil as murdering 150 students at an all-girls school. I'm obviously not defending that.

    Really? I don’t know how many were in his house but at most it’s attempted murder of a few versus killing 150.

    I see a difference.

    US law sees a difference too. The person who threw the firebomb will get the full weight of the law if they are caught, and spend an awfully long time in prison.

    Those that killed the school girls will never face punishment.

    • If you want to draw that distinction, then don't you need to account for intent? I don't think the USG intended to bomb a school. The guy throwing a Molotov cocktail has even less claim to it being an accident.

      3 replies →

> Separately, Sam's belief that "AI has to be democratized; power cannot be too concentrated" rings incredibly hollow. OpenAI has abandoned its open source roots. It is concentrating wealth - and thus power - into fewer hands, not more.

We should call it what it really is: the oligopolization of intellectual work. The capital barrier to entering this market is too high, and there can be no credible open source option to prevent a handful of companies from controlling a monster share of intellectual work in the short and medium term. Yet our profession just keeps rushing headfirst into this one-way door.

>> It will not all go well. The fear and anxiety about AI is justified; we are in the process of witnessing the largest change to society in a long time, and perhaps ever. We have to get safety right, which is not just about aligning a model

The question is what they are doing about "getting safety right", and whether they are doing enough. To me it seems like all the focus is on hyper growth and maximum adoption, and safety is just an afterthought. I understand it's a competitive market and everyone is doing it, but these are just hollow words. Industries that care about safety tend to slow down.

  • I told my GF over dinner tonight that historians in 1000 years will look back to Nov 2023 as a pivotal fork where humans lost.

    Without missing a beat, she said, "If humans' loss was that complete, there would be no historians."

    I responded that I never said they were human historians.

    • > I told my GF over dinner tonight that historians in 1000 years will look back to Nov 2023 as a pivotal fork where humans lost.

      Yes, because no one listened to me. It was early-to-mid 2024, and here as well as in other places, people kept saying "oh well, the cat's out of the bag now, nothing can be done, it can't be stopped". I pointed out that only 4 or so planes being made to collide with TSMC, NVIDIA, and ASML would be enough to give at least a decade of breathing room while we try to figure out how to keep this technology safe. I'm almost certain there were people who read it on here, as well as elsewhere, who could have made it happen.

      _Now_ it is indeed too late.

Is it okay to profit off of a machine that kills innocent people? Would it be immoral to attack the builder of that machine, if it stopped the operation of the machine?

  • Oh, come on, be serious: if that’s the argument then why start with Sam Altman?

    If you want to hold the leader of a contemporary tech giant responsible for causing excess deaths then Meta and Zuckerberg would be a lot higher up the list - maybe even at the very top.

    Now I despise Mark Zuckerberg, but I don’t want to firebomb his house: I want his company neutered and/or broken up, I want him stripped of his ill-gotten wealth, and ideally I want him to face criminal prosecution and incarceration.

    But the point is this: whoever firebombed Sam Altman’s house didn’t do it out of a principled stance - in fact I suspect they barely expended any thought on the matter - because if they were really acting out of principle they’d have chosen a different target, they’d have done some research into who is trying to expose and bring down that target, and they’d have figured out how they could help rather than just randomly engage in violence. Whereas this was just a dangerous stunt.

    • They could have chosen the target that was most available to them. Or they could feel particularly wronged by Sam Altman. Maybe they have Iranian friends.

    • > why start with Sam Altman?

      Well Zuck has that big scary hedge, and I’m sure people have been going after him for ages.

      > I despise Mark Zuckerberg, but I don’t want to firebomb his house: I want his company neutered and/or broken up, I want him stripped of his ill-gotten wealth, and ideally I want him to face criminal prosecution and incarceration.

      Great! Is the plan to wait until after the billionaires have their AI-controlled military drone swarms to have this revolution? Because they already control your government - I don’t think you will achieve anything like this through legal means.

      1 reply →

    • This has already been made as a movie: Terminator 2: Judgment Day. Sarah Connor is out to kill Dyson to stop Skynet from becoming a thing, and the audience watched it thinking she was probably justified, but was uncomfortable anyway. Spoiler alert: she ended up shooting but not killing him.

      My point is, we've seen this movie and killing Sam Altman is uncomfortable but justified.

  • I'm on the skeptic side of "AI" and find this entire industry obnoxious, but your argument doesn't hold any water.

    Technology that can be used to kill innocent people is all around us. Would it be moral to attack knife manufacturers? Attacking one won't make the technology disappear. It has been invented, so we have to live with it.

    Also, it's a stretch to say that "AI" "kills innocent people". In the hands of malicious people it can certainly do harm, but even in extreme cases, "AI" can currently only be used very indirectly to actually kill someone.

    Technology itself is inert. What humans do with technology should be regulated.

    IMO the fabricated concern around this tech is just part of the hype cycle. There's nothing inherently dangerous about a probabilistic pattern generator. We haven't actually invented artificial intelligence, despite how it's marketed. What we do need to focus on is educating people to better understand this tech and use it safely, on restricting access to it so that we can mitigate abuse and avoid flooding our communication channels with garbage, and on better detection and mitigation technology to flag and filter it when it is abused. Everything else is marketing hype and isn't worth paying attention to.

    • > Would it be moral to attack knife manufacturers?

      Apply this to guns.

      Then look at how this works in the US. You could, but then a law was made to protect gun manufacturers: the Protection of Lawful Commerce in Arms Act.

      AI will get this treatment I’m sure.

    • >Would it be moral to attack knife manufacturers?

      if they're selling the knives knowingly to a knife-murderer, it might be worth discussing.

      Sam Altman is not, although he portrays himself that way, some geeky guy without power who just builds products; he's the guy who makes the decision to supply this tech directly to the US government, which is on the record about using it for military operations. And you're right on the last point. Sure, the 20-year-old guy who threw a molotov cocktail at Sam's house is, I'm going to assume for now given the topic Sam chose for the piece, an anti-tech guy.

      But assume for a second you had your family wiped out in a bombing run because Pete Hegseth attempted to prompt himself to victory with the statistical lottery machine. If the CEO knew this and enabled it to add another zero to his bank account, I'm not so sure about the ethics of that one.

    • Sibling comment already said it, but yes, I was specifically alluding to Altman's decision to allow the US government to use their AI to choose bombing targets without a human in the loop - perhaps this is why the US government double-tapped[1] a school, killing 160 girls, all younger than 12, when the school was clearly marked on Google Maps.

      I also vigorously dislike the industry, but your stance 'I'm on the skeptic side of "AI"' is something you need to address - saying this in the friendliest way possible, you are wrong.

      AI needs to be opposed, because the billionaires are going to use it to turn the world into shit, but if the best the AI opposition can muster is "AI isn't useful", we are fucked. It's extremely powerful and can do bizarro things when you rig it up with tools - the kinds of things we need to prevent companies like Google from doing with it - and no one is paying attention.

      [1] double-tapped: a phrase referring to the practice of firing a second missile after the first to kill any rescuers or surviving schoolgirls

      7 replies →

I didn't think Hacker News needed an explicit "calls for violence are bad" guideline but the comments here have shown otherwise.

  • It would be extremely difficult to have political discussions without condoning violence. Deciding what sorts of violence are OK is an inherent part of politics. In practice, there's no way to ban calls for violence without banning the discussion of wide swaths of political topics.

  • Do you feel the same way about comments that support the US military action in Iran? Why or why not?

    • It is unnecessary, and it was an obvious offense, not defense. Of course it is "bad". We (Trump) need(s) to stop creating wars and fucking up the economy, while killing others. It is bad all the way down.

      2 replies →

  • I agree with the idea that calls for violence are bad; however most people in the world are more than happy to support both violence and calls for same against people and organizations they believe to be sufficiently significant threats.

    Are calls for violence against Hitler during WW2 bad? How about the Japanese imperial navy?

    How about calls for violence against Putin during his war of aggression?

    This isn’t rhetoric; I’m just pointing out that it isn’t as black and white as people seem to make it. (It is black and white for me, as I’m with Asimov on the matter, but it isn’t for most humans.)

  • If you can't think of a single occurrence in history that directly disproves your proposed guideline, it's time to drop whatever you're doing and study history.

    If you can think of one, then you shouldn't be proposing introduction of guidelines that are blatantly false. Or would you like a "1+1 is not 2" guideline to accompany it?

  • Are calls for violence bad when you're calling for throwing a molotov cocktail at a child? At an adult? At a serial killer? At someone who's about to shoot you unprovoked? At someone who murdered your family? At someone who's about to?

    If you said "yes" to all of the above, I'd love to know your reasoning.

If we are going to say violence isn’t okay then it is important that we be clear about the boundaries of what we define as violence.

Theft is a nice analogy here. The default model of theft is property crime but the largest type of theft is wage theft.

If we fret about violence done against individuals but not violence against groups our attention is going to end up steered in a narrow direction.

That's not true.

As a defense contractor, Altman is a legitimate target for a country that the US has attacked, like Iran.

The US is engaging in military action against many countries and has threatened to annex or invade allies.

In that context Altman is 100% a legitimate target to those whose sovereignty is threatened and whose people are being killed.

I categorically reject that assertion. Two simple examples: 1) when you see someone assaulting someone else, it's absolutely ok to attack them, and 2) the American revolution!

It's like that old joke:

A man offers a young woman $1,000,000 to sleep with him for one night.

“For a million dollars? Sure, I’ll sleep with you.”

He smiles at her, “How about $50, then?”

“How dare you! I’m not a whore!”

“Look, lady, we’ve already agreed what you are, now we’re just negotiating the price.”

Similarly in this case, you can't make up absolutes and assert they're true, while ignoring that the real world is more complicated. And once you do realize the world is complicated, you realize there aren't absolutes: everyone is a prostitute, terrorist, or whatever other bad label you want to throw at them ... it's just a matter of degree.

So no, it's not always wrong to physically attack someone like this. You can debate specifically whether Altman has committed enough violence himself to justify violence against him: that's something two people can reasonably disagree on. But you can't just say "violence bad" like it's some great pearl of wisdom, while ignoring that violence has in fact been good many times throughout history.

> OpenAI has abandoned its open source roots.

It was only a matter of time. The font size on the dollar sign kept increasing, and eventually selfish humans will always crack. Keeping it open would have required it to become a public utility. Private companies don't do altruistic things unless they benefit.

He's saying that just so he can use it if another company gets bigger than OpenAI ("you can't have all the power"). If OpenAI were the top dog by a large margin, you wouldn't hear him say a peep about this (as was demonstrated by his actions with the charter).

Violence is language that needs no translation. Everyone across the world, every culture, every country, every social group - from elites to homeless can converse in it using the same vocabulary.

It is useful to have some degree of mastery in this discipline. Sometimes it is the only language that can deliver the important message to an unwilling listener.

‘Working towards prosperity for everyone’ was extremely hollow as well. If he believed this, he would be running his company as a cooperative and not as a for-profit company.

Agreed. Sam's full of crap and the way we tackle that is with conversations, not violence. He deserves to grow old like anyone else, violence isn't an answer.

  • I don't condone violence, but the contract he's signed with the US military is a credible threat to everyone in the US. OpenAI will now certainly be called on to assist in domestic mass surveillance, under threat of the kind of severe penalties Anthropic has faced. So why did he agree to that contract, unless he's willing to provide that assistance? It's gone well beyond conversation, though not to a point where violence is appropriate. Boycotts and hostility are definitely appropriate at this point IMO, though.

  • He isn't going to suddenly grow a conscience from a riveting, intellectually stimulating conversation.

  • > the way we tackle that is with conversations, not violence

    I think the breakdown here is that conversation seems to have no power. To only be a bit hyperbolic, the only language with power is money -- or violence. To the extent that ordinary people cannot make change with "conversation" (which I interpret here to mean dialog within society, including with lawmakers), they feel compelled to use violence instead.

    A non-rhetorical question: What recourse do non-billionaires have when conversation has less and less power, while money has more and more, and those with money are making much more money?

    • There's still a meaningful difference between violence wielded by a single individual who feels angry or unheard, and violence wielded by a large representative group who has invested genuine effort in conversation before collectively deciding violence is required.

      3 replies →

  • It's pretty amazing to observe people experience the past ten years in American history and continue to think that we can out-talk the bad people in the world.

    Michelle Obama's, "When they go low, we go high", is some of the stupidest political advice and a generation has lost so much because of it. (The generation before got West Winged into believing the same thing.)

    When you look to the right, you have a stolen election in 2000, a stolen supreme court seat, an attempted coup, and relentless winning despite it.

    • I don't think street violence solves anything. I don't think Michelle was right; sometimes you have to fight fire with fire. But you don't fight words with literal firebombs.

    • This may come right when Americans see themselves backsliding relative to other power blocs, and allies turning away. It’s started.

      But it seems a distant hope at best.

  • That sentiment always comes from people who are better at fighting with communication.

AGI will be democratized when it's discovered... just right after AWS, Microsoft, and Oracle finish their 6-month beta test.

> It's never OK to physically attack someone like this. Full stop.

I agree. The French Revolution was really, really mean.

  • Are you familiar with the details of the French Revolution? Some of the eventual outcomes were indeed positive, but a lot of what actually went on was pretty horrific.

    • It was horrific. Revolutions tend to be. Yet our institutions continue consolidating money and power in fewer and fewer hands. If that doesn't stop, we'll be headed there again. It will probably be even worse this time.

    • A lot of what happened during the French revolution was horrific... This is such a bewildering sentence in this context. Yes, killing the rulers is horrific. Revolutions are horrific. Wars are horrific. It seems irrelevant to what the parent is (sarcastically) saying.

      3 replies →

    • At the same time considering the people participating, there wasn't a way out of the problems that didn't involve violence. Different outcomes would require different choices that require different people.

    • What are you arguing? That people should not violently overthrow their corrupt leaders? That the French should've let the Ancien Régime entrench and continue? That the serfs (slaves) in tsarist Russia should've stayed put and not revolted against the corrupt and incompetent Nicholas II? Or that the Hungarians and Czechoslovaks should not have revolted against the totalitarian regimes propped up by the Russians? Should the Romanians in 1989 have stayed at home, in cold and hunger, and let the Ceaușescu regime continue to cruelly oppress them?

    • You think the cyberpunk dystopia we're headed towards isn't going to be horrific? The one where 99% of the human race has no economic value? Where the 1% helm megagigaultracorporations with fully autonomous AI powered kill bots? Where they think it's no big loss if they genocide an entire human population because all those people were doing nothing but costing them money anyway?

      This is our only chance to transition to a post-scarcity society. We won't have another. Allowing them to monopolize access to AI is a fatal mistake.

      4 replies →

  • The French Revolution brought on Napoleon, wars that brought about the deaths of many millions of people, and then another emperor. The subsequent events are where they found liberty.

If Sam disperses his power, we can believe him. So long as he's just concentrating wealth and power, he's just another tech bro.

An oligarch who promotes “democracy”. Is he trying to cynically ingratiate himself, or is he really that deaf to the irony?

> It's never OK to physically attack someone like this.

I broadly agree. But… there are some who have lived who made the world a worse place. Who gets to decide? Trump has done a bit of this sort of deciding, and it hasn’t gone great so far, and there is no sign that it’s actually helped.

Can't say I feel sorry for the guy. Anyone who actually believes his platitudes about "democratizing" AI is far too naive. If he really believed that, he'd make a torrent out of ChatGPT's weights and upload it to the pirate bay.

The fact of the matter is these AI CEOs are actively trying to economically disenfranchise 99% of the human race. The ultimate corollary of capitalism is that people who aren't economically productive need not be kept alive any longer. Unproductive people are nothing but cost, better to just let them die. A future where the richest classes can turn the underclasses into soylent is now very much within the realm of possibility.

If this doesn't radicalize people into actual violence, I simply have no idea what will. "Attacking someone is wrong" is a completely meaningless statement to make to someone who believes society as we know it today is going to be destroyed. Honestly, I can't even blame them.

> AI has to be democratized; power cannot be too concentrated

That sounds like something someone says when he understands his weak position, especially someone as ruthless, dishonest, and narcissistic as Altman.

[flagged]

  • The idea that firing you or stealing your wages is the worst a CEO can do to you is itself a product of the taboo against physical violence. There are a number of famous incidents from the late 1800s and early 1900s, when the taboo was weaker, of CEOs sending private armies to shoot inconvenient labor movements. It's not an equilibrium you should defect from lightly.

    • A CEO can choose physical, mental, legal or financial violence against the common man. The common man only has the choice of physical violence. Without it he is impotent.

      2 replies →

  • [flagged]

    • Change and progress like the people of France deciding they had enough of injustice and nobles' impunity, then? A little short-term pain for social progress? We agree.

      2 replies →

    • That sounds suspiciously like an "ends justify the means" argument.

      It's easy to say we need to be willing to accept short term pains when it's someone else who has to bear the brunt of them.

[flagged]

Well said, I condemn the violence as well. I had to stop at that point too though, it's so blatantly disingenuous and hypocritical.

it isn’t ok to attack people.

whether this way or in slow motion mass attacks on people.

an attack on a society that lasts years is still an attack and i wish the collective we would realize this.

“it’s ok if millions suffer now for me to realize my dream” is just wrong.

i’ll never understand how these guys fail to realize: they actively push for people not to care about the destruction they cause. that’s obviously going to bite them in the ass whenever they’re on the receiving end.