Comment by strangescript

2 months ago

[flagged]

Yeah, this is why I'm having a hard time taking many programmers seriously on this one.

As a general class of folks, programmers and technologists have been putting people out of work via automation for as long as the profession has existed. We justified it in many ways, but generally as "if I can replace you with a small shell script, your job shouldn't exist anyway and you can do something more productive instead". These same programmers would look over the shoulders of "business process" people to see how they did their jobs, "stealing" the workflows and processes so they could be automated.

Now that programmers' jobs are on the firing block, all of a sudden automation is bad. It's hard to sort through genuine vs. self-serving concern here.

It's more or less a case of what comes around goes around to me so far.

I don't think LLMs are great or problem-free, or even that scraping the Internet for training data is moral. I just find the reaction to be incredibly hypocritical.

Learn to prompt, I guess?

  • If we're talking about the response from the OP, people of his caliber are in no danger of being automated away. It was an entirely reasonable revulsion at an LLM in his inbox wearing a linguistic skinsuit, a mockery of a thank-you email.

    I don't see the connection to handling the utilitarianism of implementing business logic. Would anyone find a thank-you email from an LLM to be of any non-negative value, no matter how specific or accurate in its acknowledgement it was? Isn't it beyond uncanny valley and into absurdism to have your calculator send you a Christmas card?

    • To be clear, my comment was in no way intended towards Rob Pike or anyone of his stature and contributions to the technology field.

      It was definitely a less-than-useful comment directed towards the tech bro types that came later when the money started getting good.

    • People of his caliber are not being automated away, but people pay less attention to him and don't worship him like before, so he is butthurt.

  • Are people here objecting to Gen AI being used to take their jobs? I mainly see people objecting to the social, legal, and environmental consequences.

    • What's the problem with that, anyway? I object to training a machine to take/change my job [building them, telling them what to do]. What's more, they want me to pay? Hah. This isn't a charity. I either strike fortune, retire while the getting is good, or simply work harder for nothing. Hmm. I think I'll keep not displacing people, actually. Myself included.

      To GP: not all of us who automate go for low hanging fruit, I guess.

      To the peer calling this illegitimate [or anyone, really]: without the assistance of an LLM, please break down the foul nature of... let me check my notes, gainful employment.

    • > Are people here objecting to Gen AI being used to take their jobs?

      Yes, even if they don't say it. The other objections largely come from the need to sound more legitimate.


  • >programmers and technologists have been putting people out of work

    I think it's more a case of causing people to do different work. About 75% of the workforce used to be in agriculture, but tractors and the like reduced that to 2% or so. I'm not sure the people now working as programmers would be better off if that hadn't happened and they were digging potatoes instead.

  • I wouldn't be angry if current AI _only_ automated programmers/software engineers. I'd be worried and stressed out, but not angry.

    But it also automates _everything else_. Art and self-expression, most especially. And it did so in a way that is really fucking disgusting.

    • Well put, it's not the automation of programming that bothers me, it's the automation of what it means to be human.

  • > Now that programmers jobs are on the firing block all of a sudden automation is bad. It's hard to sort through genuine vs. self-serving concern here.

    The concern is bigger than developer jobs being automated. The stated goal of the tech oligarchs is to create AGI so most labor is no longer needed, while CEOs and board members of major companies get unimaginably wealthy. And their digital gods allow them to carve up nations into fiefdoms for the coming techno fascist societies they envision.

    I want no part of that.

I think there is a difference between automating “things” (as you put it) and getting to the point where people are on stage suggesting that the government becomes a “backstop” to their investments in automation.

I can imagine AI still being so useless at creating real value in 100 years that its parent companies have to resort to circular deals to pump up their stock.

[flagged]

  • I always wonder if the general sentiment toward genai would be positive if we had wealth redistribution mechanisms in place, so everyone would benefit. Obviously that's not the case, but if you consider the theoretical, do you think your view would be different?

    • To be honest, I'm not even sure I'm fully on board with the labor theft argument. But I certainly don't think generative AI is such an unambiguous boon for humankind that we should ignore any possible negative externalities just to advance it.

  • > "To someone who believes that AI training data is built on the theft of people's labor..."

    i.e. people who are not hackers. Many (most?) hackers have been against the idea of copyright and intellectual property from the beginning. "Information wants to be free," after all.

    Must be galling for people to find themselves on the same side as Bill Gates and his 1976 Open Letter to Hobbyists, which was also about the "theft of people's labor".

  • [flagged]

    • > The difference is that people who write open source code or release art publicly on the internet from their comfortable air conditioned offices voluntarily chose to give away their work for free

      That is not nearly the extent of AI training data (e.g. OpenAI training its image models on Studio Ghibli art). But if by "gave their work away for free" you mean "allowed others to make [proprietary] derivative works", then that is in many cases simply not true (e.g. GPL software, or artists who publish work protected by copyright).

    • What? Over 183K books were pirated by these big tech companies to train their models. They knew what they were doing was wrong.

  • > believes that AI training data is built on the theft of people's labor

    I mean, this is an ideological point. It's not based in reason, won't be changed by reason, and is really only a signal to end the engagement with the other party. There's no way to address the point other than agreeing with them, which doesn't make for much of a debate.

    > an 1800s plantation owner saying "can you imagine trying to explain to someone 100 years from now we tried to stop slavery because of civil rights"

    I understand this is just an analogy, but for others: people who genuinely compare AI training data to slavery will have their opinions discarded immediately.

    • We have clear evidence that millions of copyrighted books have been used as training data, because LLMs can reproduce sections from them verbatim (and employee emails literally admit to scraping the data). We have evidence of LLMs reproducing code from GitHub that was never released under a license permitting such use. We know this is illegal. What about any of this is ideological or unreasonable? It's a CRYSTAL CLEAR violation of the law, and everyone just shrugs it off because technology or some shit.


    • What makes something more or less ideological for you in this context? Is "reason" always opposed to ideology for you? What is the ideology at play here for the critics?

    • > I mean, this is an ideological point. It's not based in reason

      You can't be serious.

And environmental damage. And damage to our society. Not that anybody here tried to stop LLMs; the genie is out of the bottle. You can still hate it, and of course enact legislation to reduce harm.

When I read your comment, I was “trained” on it too. My neurons were permanently modified by it. I can recall it, to some degree, for some time. Do I necessarily owe you money?

  • You do owe money for reusing some things that you read, and not for others. Intellectual property exists.