Comment by dakolli
17 hours ago
Their goal is to monopolize labor for anything that has to do with i/o on a computer, which is way more than SWE. It's simple: this technology cannot create new jobs; it simply lets one engineer (or any worker whose job involves computer i/o) do the work of three, therefore allowing you to replace workers (and overwork the ones you keep). Companies don't need "more work"; half the "features"/"products" that companies produce are already just extra. They can get rid of 1/3 to 2/3 of their labor and make the same amount of money, so why wouldn't they?
ZeroHedge on Twitter said the following:
"According to the market, AI will disrupt everything... except labor, which magically will be just fine after millions are laid off."
It's also worth noting that if you can create a business with an LLM, so can everyone else. And sadly everyone has the same ideas; everyone ends up working on the same things, causing competition to push margins to nothing. There's nothing special about building with LLMs, as anyone with access to the same models and the same basic thought processes can just copy you.
This is basic economics. If everyone had an oil well on their property that was affordable to operate the price of oil would be more akin to the price of water.
EDIT: Since people are focusing on my water analogy, what I mean is:
If everyone has easy access to the same powerful LLMs, that would drive the value you can contribute to the economy down to next to nothing. For this reason I don't even think powerful and efficient open source models, which are usually the next counterargument people make, are necessarily a good thing. It strips people of the opportunity for social mobility through meritocratic systems. Just like how your water well isn't going to make you rich or let you climb a social ladder, because everyone already has water.
> It's also worth noting that if you can create a business with an LLM, so can everyone else. And sadly everyone has the same ideas
Yeah, this is quite thought provoking. If computer code written by LLMs is a commodity, what new businesses does that enable? What can we do cheaply we couldn't do before?
One obvious answer is we can make a lot more custom stuff. Like, why buy Windows and Office when I can just ask Claude to write me my own versions instead? Why run a commodity operating system on kiosks? We can make so many more one-off pieces of software.
The fact software has been so expensive to write over the last few decades has forced software developers to think a lot about how to collaborate. We reuse code as much as we can - in shared libraries, common operating systems & APIs, cloud services (eg AWS) and so on. And these solutions all come with downsides - like supply chain attacks, subscription fees and service outages. LLMs can let every project invent its own tree of dependencies. Which is equal parts great and terrifying.
There's that old line that businesses should "commoditise their complement". If you're Amazon, you want package delivery services to be cheap and competitive. If software is the commodity, what is the bespoke value-added service that can sit on top of all that?
We said the same thing when 3D printing came out. Any sort of cool tech, we think everybody's going to do it. Most people are not capable of doing it. In college everybody was going to be an engineer, and then they drop out after the first intro to physics or calculus class. A bunch of my non-tech friends were vibe coding some tools with Replit and Lovable, and I looked at their stuff and yeah, it was neat, but it wasn't gonna go anywhere, and if it did go somewhere, they would need to find somebody who actually knows what they're doing. To actually execute on these things takes a different kind of thinking. Unless we get to the stage where it's just like a magic genie, lol. Maybe then everybody's going to vibe their own software.
I don't think Claude Code is like 3D printing.
The difference is that 3D printing still requires someone, somewhere to do the mechanical design work. It democratises printing but it doesn't democratise invention. I can't use words to ask a 3D printer to make something. You can't really do that with Claude Code yet either. But every few months it gets better at this.
The question is: how good will Claude get at turning open-ended problem statements into useful software? Right now a skilled human + computer combo is the most efficient way to write a lot of software. Left on its own, Claude will make mistakes and suffer from a slow accumulation of bad architectural decisions. But will that remain the case indefinitely? I'm not convinced.
This pattern has already played out in chess and Go. For a few years, a skilled Go player working in collaboration with a Go AI could outcompete both computers and humans at Go. But that era didn't last. Now computers can play Go at superhuman levels. Our skills are no longer required. I predict programming will follow the same trajectory.
There are already some companies using fine-tuned AI models for "red team" infosec audits. Apparently they're already pretty good at finding a lot of creative bugs that humans miss. (And apparently they find an extraordinary number of security bugs in code written by AI models.) It seems like a pretty obvious leap to imagine Claude Code implementing something similar before long. Then Claude will be able to do security audits on its own output. Throw that in a reinforcement learning loop, and Claude will probably become better at producing secure code than I am.
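To make that concrete, here is a minimal sketch of the generate/audit/revise cycle such a setup would be built around (not the RL training itself). `generate_code` and `security_audit` are hypothetical stand-ins for the underlying model calls, not any real API:

```python
# Hypothetical sketch: one model writes code, an audit pass reviews it,
# and the findings are fed back in until the auditor finds nothing
# (or we give up after a few rounds).

def generate_code(task: str, findings: list[str]) -> str:
    """Ask the code-writing model for an implementation, folding in prior audit findings."""
    raise NotImplementedError  # placeholder for an LLM call

def security_audit(code: str) -> list[str]:
    """Ask the auditing model for a list of security findings (empty list = clean)."""
    raise NotImplementedError  # placeholder for an LLM call

def audited_implementation(task: str, max_rounds: int = 5) -> str:
    findings: list[str] = []
    code = ""
    for _ in range(max_rounds):
        code = generate_code(task, findings)
        findings = security_audit(code)
        if not findings:  # auditor found nothing: accept this version
            break
    return code
```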
It's not our current location but our trajectory that is scary.
The walls and plateaus that have been consistently trotted out in "comments of reassurance" have not materialized. If this pace holds for another year and a half, things are going to be very different. And the pipeline is absolutely overflowing with specialized compute coming online by the gigawatt for the foreseeable future.
So far the most accurate predictions in the AI space have been from the most optimistic forecasters.
You can basically hand it a design, one that might take a FE engineer anywhere from a day to a week to complete, and Codex/Claude will have it coded up in 30 seconds. It might need some tweaks, but it's 80% complete on that first try. I remember stumbling over graphing and charting libraries; it could take weeks to become familiar with all the different components and APIs, but now you can just tell Codex to use this data and this charting library and it'll make it. All you have to do is look at the code. Things have certainly changed.
Thank you for posting this.
I'm really tired and exhausted of reading simplistic takes.
Grok is a very capable LLM that can produce decent videos. Why are most of them garbage? Because NOT EVERYONE HAS THE SKILL NOR THE WILL TO DO IT WELL!
> To actually execute on these things takes a different kind of thinking
Agreed. Honestly, and I hate to use the tired phrase, but some people are literally just built different. Those who'd be entrepreneurs would have been so in any time period with any technology.
This squares well with what I see from my non-tech and even tech co-workers. Honestly, the value-generation leverage I have now, compared to other people, is 10x or more what it was before.
HN is an echo chamber of a very small subgroup. The majority of people can't utilize it and need to have this further dumbed down and specialized.
That's why marketing and conversion rate optimization work: it's not all about the technical stuff, it's about knowing what people need.
For VC-funded companies the game was often not much different: software was just part of the expenses, sometimes a large part, sometimes a smaller one. Eventually you could just buy the software you needed, but that didn't guarantee success. There were dramatic failures and outstanding successes, and I wish it weren't so, but most of the time the codebase was not the deciding factor. (Sometimes it was: Airtable, Twitch, etc., bless the engineers, but I don't believe AI would have solved those problems.)
3 things
1) I don’t disagree with the spirit of your argument
2) 3D printing has higher startup costs than code (you need to buy the damn printer)
3) YOU are making a distinction when it comes to vibe coding by non-tech people. The way these tools are being sold, and the way investments are being made, is based on non-domain people developing domain-specific taste.
This last, "reasonable" argument ends up serving as a bait and switch, shielding these investments. I might be wrong, but your comment doesn't indicate that you believe the hype.
This whole comment thread is really echoing and adding to some thoughts I've had lately on the shift from thinking of LLMs as replacing the engineering needed to make software (much of which is about integration, longevity and customization of a general system) to thinking of LLMs as replacing buying software.
If most software is just used by me to do a specific task, then making software for me to do that task will become the norm. Following that thought, we are going to see a drastic reduction in SaaS solutions, as many people who were buying a flexible toolbox for one occasional use case will just get an LLM to make them the script/software to do that task as and when they need it, without any concern for things like security, longevity, or ease of use by others (for better or for worse).
I guess what I'm circling around is this: if we define engineering as building the complex tools that have to interact with many other systems, persist, and be generally useful and understandable to many people, and we consider that many people don't actually need that complexity for their use of the system (the complexity arises from it needing to serve its purpose at huge scale over time), then maybe there will be less need for engineers, but perhaps first and foremost because the problems engineering is required to solve shrink once more focused and bespoke solutions to people's problems are available on demand.
As an engineer I have often felt threatened by LLMs and agents of late, but I find that if I reframe it from agents replacing me to agents shifting which problems are even valuable to solve, it feels less threatening for some reason. I'll have to mull it over more.
Taking it further, imagine a traditional desktop OS but it generates your programs on the fly.
Google's weird AI browser project is kind of a step in this direction. Instead of starting with a list of programs and services and customizing your work to that workflow, you start with the task you need accomplished and the operating system creates an optimized UI flow specifically for that task.
Even if code gets cheaper, running your own versions of things comes with significant downsides.
Software exists as part of an ecosystem of related software, human communities, companies etc. Software benefits from network effects both at development time and at runtime.
With fully custom software, your users / customers won't be experienced with it. AI won't automatically know all about it, or be able to diagnose errors without detailed inspection. You can't name-drop it. You don't benefit from shared effort by the community / vendors. Support is more difficult.
We are also likely to see "the bar" for what constitutes good software rise over time.
All the big software companies are in a position to direct enormous token flows into their flagship products, and they have every incentive to get really good at scaling that.
The logical endgame (which I do not think we will necessarily reach) would be the end of software development as a career in itself.
Instead software development would just become a tool anybody could use in their own specific domain. For instance if a manager needs some employee scheduling software, they would simply describe their exact needs and have software customized exactly to their needs, with a UI that fits their preference, ready to go in no time, instead of finding some SaaS that probably doesn't fit exactly what they want, learning how to use it, jumping through a million hoops, dealing with updates you don't like, and then paying a perpetual rent on top of all of this.
Writing the code has never been the hard part for the vast majority of businesses. It's become an order of magnitude cheaper, and that WILL have effects. Businesses that are selling crud apps will falter.
But your hypothetical manager who needs employee scheduling software isn't paying for the coding, they're paying for someone to _figure out_ their exact needs, and with a UI that fits their preference, ready to go in no time.
I've thought a lot about this and I don't think it'll be the death of SaaS. I don't think it's the death of the software engineer either — but a major transformation of the role, and the death of your career _if you do not adapt_, and fast.
Agentic coding makes software cheap, and will commoditize a large swath of SaaS that exists primarily because software used to be expensive to build and maintain. Low-value SaaS dies. High-value SaaS survives based on domain expertise, integrations, and distribution. Regulations adapt. Internal tools proliferate.
Software isn't just the code; it's also the stability that can only be gained after years of successful operation and ironing out bugs, the understanding of who your customers truly are, what their actual needs are (not their perceived needs), which features will drive growth, etc. I think there's still a "there" there.
I think the kind of software that everybody needs (think Slack or Jira) is at the greatest risk, as everybody will want to compete in those fields, which will drive margins to 0 (and that's a good thing for customers)! However, I think small businesses pandering to specific user groups will still be viable.
> One obvious answer is we can make a lot more custom stuff. Like, why buy Windows and Office when I can just ask Claude to write me my own versions instead? Why run a commodity operating system on kiosks? We can make so many more one-off pieces of software
Yes, it will enable a lot of custom one-off software, but I think people are forgetting the advantages of multiple copied instances, which is what enabled software to be so successful in the first place.
Mass production of the same piece of software creates standards: every word processor uses the same format and displays it the same way.
Every date library you import will calculate "two months from now" the same way, so that's code you don't have to constantly double-check in your debug sessions.
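As a tiny concrete illustration of that point, here's a sketch using python-dateutil (one shared library among many that would make the same point); the dates are arbitrary:

```python
# Everyone who imports the same date library gets the same answer for
# "two months from now", including awkward month-end edge cases, so none
# of this needs re-verifying in every project.
from datetime import date
from dateutil.relativedelta import relativedelta

print(date(2024, 1, 15) + relativedelta(months=2))   # 2024-03-15
print(date(2023, 12, 31) + relativedelta(months=2))  # 2024-02-29 (clamped to month end)
```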
This reminds me of the old idea of the Lisp curse. The claim was that Lisp, with the power of homoiconic macros, would magnify the effectiveness of one strong engineer so much that they could build everything custom, ignoring prior art.
They would get amazing amounts done, but no one else could understand the internals because they were so uniquely shaped by the inner nuances of one mind.
> why buy Windows and Office when I can just ask Claude to write me my own versions instead? Why run a commodity operating system on kiosks?
Linux costs $0. Using an AI to create a Linux clone compatible with your hardware from the spec sheets would, for complicated hardware, cost thousands to millions of dollars in tokens, and you'd end up with something that works worse than Linux (or, more likely, something that doesn't even boot).
Even if the price falls by a thousand fold, why would you spend thousands of dollars on tokens to develop an OS when there's already one you can use?
Even if software becomes cheaper to write, it's not free, and there's a lot of software (especially libraries) out there which is free.
> cost thousands to millions of dollars in tokens
> Even if the price falls by a thousand fold, why would you spend thousands of dollars on tokens to develop an OS when there's already one you can use?
Why do you assume token prices will only fall a thousandfold? I'm pretty sure they have already fallen by more than that in the last few years, at least if we're speaking about like-for-like intelligence.
I suspect AI token costs will fall exponentially over the next decade or two, like Dennard scaling / Moore's law did for CPUs over the last 40 years, especially given the amount of investment being poured into LLMs at the moment. Essentially the entire computing hardware industry is retooling to manufacture AI clusters.
If it costs you $1-$10 in tokens to get the AI to make a bespoke operating system for your embedded hardware, people will absolutely do it, especially if it frees them from supply chain attacks. Linux is free, but Linux isn't well optimized for embedded systems. I think my electric piano runs Linux internally. It takes 10 seconds to boot. Boo to that.
> If software is the commodity, what is the bespoke value-added service that can sit on top of all that?
Troubleshooting and fixing the big mess that nobody fully understands when it eventually falls over?
> Troubleshooting and fixing the big mess that nobody fully understands
If that's actually the future of humans in software engineering, then that sounds like a nightmare career that I want no part of. Just the same as I don't want anything to do with the gigantic mess of COBOL and Java powering legacy systems today.
And I also push back on the idea that LLMs can't troubleshoot and fix things and will therefore eventually require humans again. My experience has been the opposite: I've found that LLMs are even better at troubleshooting and fixing an existing code base than they are at writing greenfield code from scratch.
> If software is the commodity, what is the bespoke value-added service that can sit on top of all that?
It would be cool if I can brew hardware at home by getting AI to design and 3D print circuit boards with bespoke software. Alas, we are constrained by physics. At the moment.
> If software is the commodity, what is the bespoke value-added service that can sit on top of all that?
Aggregation. Platforms that provide visibility, influence, reach.
> Yeah, this is quite thought provoking. If computer code written by LLMs is a commodity, what new businesses does that enable? What can we do cheaply we couldn't do before?
The model owner can just withhold access and build all the businesses themselves.
Financial capital used to need labor capital. It doesn't anymore.
We're entering into scary territory. I would feel much better if this were all open source, but of course it isn't.
I think this risk is much lower in a world where there are lots of different model owners competing with each other, which is how it appears to be playing out.
Why would the model owner do that? You still need some human input to operate the business, so it would be terribly impractical to try to run all the businesses. Better to sell the model to everyone else, since everyone will need it.
The only existential threat to the model owner is everyone being a model owner, and I suspect that's the main reason why all the world's memory supply is sitting in a warehouse, unused.
I have never been in an organization where everyone was sitting around wondering what to do next. If the economy were actually as good as certain government officials claim, we would be hiring people left and right to do three times as much work, not firing.
That's the thing: profits and equities are at all-time highs, but these companies have laid off 400k SWEs in the US in the last 16 months, which should tell you what their plans are for this technology and for augmenting their businesses.
The last 16 months of layoffs are almost certainly not because of LLMs. All the cheap money went away, and suddenly tech companies have to be profitable. That means a lot of them are shedding anything not nailed down to make their quarter look better.
I always find these "anti-AI" AI believer takes fascinating. If true AGI (which you are describing) comes to pass, there will certainly be massive societal consequences, and I'm not saying there won't be any dangers. But the economics in the resulting post-scarcity regime will be so far removed from our current world that I doubt any of this economic analysis will be even close to the mark.
I think the disconnect is that you are imagining a world where somehow LLMs are able to one-shot web businesses, but robotics and real-world tech is left untouched. Once LLMs can publish in top math/physics journals with little human assistance, it's a small step to dominating NeurIPS and getting us out of our mini-winter in robotics/RL. We're going to have Skynet or Star Trek, not the current weird situation where poor people can't afford healthy food, but can afford a smartphone.
> We're going to have Skynet or Star Trek
Star Trek only got a good society after an awful war, so neither of these options are good.
Star Trek only got a good society after discovering FTL and the existence of all manner of alien societies. And even then, Star Trek's story motivations for why we turned good sound quite implausible given what we know about human nature and history. No effing way it will ever happen, even if we discover aliens. It's just a wishful fever dream.
> Their goal is to monopolize labor for anything that has to do with i/o on a computer, which is way more than SWE. It's simple: this technology cannot create new jobs; it simply lets one engineer (or any worker whose job involves computer i/o) do the work of three, therefore allowing you to replace workers (and overwork the ones you keep). Companies don't need "more work"; half the "features"/"products" that companies produce are already just extra. They can get rid of 1/3 to 2/3 of their labor and make the same amount of money, so why wouldn't they?
Yes, that's how technology works in general. It's good and intended.
You can't have baristas (for all but the extremely rich), when 90%+ of people are farmers.
> ZeroHedge on twitter said the following:
Oh, ZeroHedge. I guess we can stop any discussion now..
The baristas example only makes me think that, with growing wealth disparity and no obvious exit path for white-collar workers, we might see a big return of servant-like jobs for the sub-1%. Who wouldn't want to wake up and spend the day assisting the life of some remaining upper-middle-class Anthropic employee?
Last I checked, the tractor and plow are doing a lot more work than 3 farmers, yet we've got more jobs and grow more food.
People will find work to do, whether that means there are tens of thousands of independent contractors, whether that means people migrate into new fields, or whether that means there are tens of multi-trillion-dollar companies that would've had 200k engineers each but now only have 50k each, and it's basically a net nothing.
People will be fine. There might be big bumps in the road.
Doom is definitely not certain.
America has lost over 50% of farms and farmers since 1900. Farming used to be a significant employer, and now it's not. Farming used to be a significant part of GDP, and now it's not. Farming used to be politically significant... and now it's complicated.
If you go to the many small towns in farm country across the United States, I think the last 100 years will look a lot closer to "doom" than "bumps in the road". Same thing with Detroit when we got foreign cars. Same thing with coal country across Appalachia as we moved away from coal.
A huge source of American political tension comes from the dead industries of yester-year combined with the inability of people to transition and find new respectable work near home within a generation or two. Yes, as we get new technology the world moves on, but it's actually been extremely traumatic for many families and entire towns, for literally multiple generations.
Farming GDP has grown 2-3x since the 1900s. It's just everything else has grown even more. That doesn't make farming somehow irrelevant work. There's just more stuff to do now. This seems pretty consistent with OPs point.
Same thing with Walmart and local shops.
On the one hand, it brings a greater selection, at cheaper prices, delivered faster, to communities.
On the other hand, it steamrolls any competing businesses and extracts money that previously circulated locally (to shareholders instead).
Why does it matter that a lot of people were farming? If anything that's a good argument for not worrying: we don't have 50%+ unemployment, so clearly all those farming jobs were reallocated.
> Last I checked, the tractor and plow are doing a lot more work than 3 farmers, yet we've got more jobs and grow more food.
Not sure when you checked.
In the US more food is grown for sure. For example just since 2007 it has grown from $342B to $417B, adjusted for inflation[1].
But employment has shrunk massively, from 14M in 1910 to around 3M now[2] - and 1910 was well after the introduction of tractors (plows not so much... they have been around since antiquity and are mentioned extensively in the Old Testament, for example).
[1] https://fred.stlouisfed.org/series/A2000X1A020NBEA
[2] https://www.nass.usda.gov/Charts_and_Maps/Farm_Labor/fl_frmw...
That's his point. Drastically reducing agricultural employment didn't keep us from getting fed (and led to a significantly richer population overall -- there's a reason people left the villages for the industrial cities)
More jobs where? In farming? Is that why farming in the US is dying, being destroyed by corporations, with farmers now prisoners to John Deere? It's hilarious that you chose possibly the worst counterexample here…
More output, not more farmers. The stratification of labor in civilization is built on this concept, because if not for more food, we'd have more "farmer jobs" of course, because everyone would be subsistence farming...
Wow, you are making the point that everything will be OK using farming! Farming is struggling, consolidated into big, big players, and subsidies keep it going.
You get laid off and spend 2-3 years migrating to another job type; what do you think that will do to your life or family? Those starting out will have their lives put on pause, and those 10 years from retirement are stuffed.
> It's also worth noting that if you can create a business with an LLM.
False. Anyone can learn about index ETFs and still yolo into 3DTE options and promptly get variation margined out of existence.
Discipline and contextual reasoning in humans are not dependent on the tools they are using, and I think the take is completely and definitively wrong.
*Checks Bio* Owns AI company and.... the whole family tree's portfolio :eyes:
So like... every business having electricity? I am not an economist, so I would love someone smarter than me to explain how this is any different from the advent of electricity and how that affected labor.
An obvious argument to this is that electricity is becoming a lot more expensive (because of LLMs), so how is that going to affect labour?
The difference is that electricity wasn't being controlled by oligarchs that want to shape society so they become more rich while pillaging the planet and hurting/killing real human beings.
I'd be more trusting of LLM companies if they were all workplace democracies, not really a big fan of the centrally planned monarchies that seem to be most US corporations.
Heard of Carnegie? He controlled coal when it was the main fuel used for heating and electricity.
> The difference is that electricity wasn't being controlled by oligarchs that want to shape society so they become more rich while pillaging the planet and hurting/killing real human beings.
Yes it was. Those industrialists were called "robber barons" for a reason.
Its main distinction from previous forms of automation is its ability to apply reasoning to processes and its potential to operate almost entirely without supervision, and also to be retasked with trivial effort. Conventional automation requires huge investments in a very specific process. Widespread automation will allow highly automated organizations to pivot or repurpose overnight.
While I'm on your side, electricity was (is?) controlled by oligarchs whose only goal was to become richer. It's the same type of people who now build AI companies.
Control over the fuels that create electricity has defined global politics, and global conflict, for generations. Oligarchs built an entire global order backed up by the largest and most powerful military in human history to control those resource flows, and have sacrificed entire ecosystems and ways of life to gain or maintain access.
So in that sense, yes, it’s the same
I mean your description sounds a lot like the early history of large industrialization of electricity. Lots of questionable safety and labor practices, proprietary systems, misinformation, doing absolutely terrible things to the environment to fuel this demand, massive monopolies, etc.
> And sadly everyone has the same ideas, everyone ends up working on the same things
This is someone telling you they have never had an idea that surprised them. Or more charitably, they've never been around people whose ideas surprised them. Their entire model of "what gets built" is "the obvious thing that anyone would build given the tools." No concept of taste, aesthetic judgment, problem selection, weird domain collisions, or the simple fact that most genuinely valuable things were built by people whose friends said "why would you do that?"
I'm speaking about the vast majority of people, who, yes, build the same things. Look at any HN post over the last 6 months and you'll see everyone sharing clones of the same products.
Yes, some ideas are novel, but I would argue that LLMs destroy or atrophy the creative muscle in people, much like how GPS-powered apps destroyed people's mental navigation "muscles".
I would also argue that very few uniquely valuable "things" built by people ever had others saying "Why would you build that?", unless we're talking about paradigm-shifting products that are hard for people to imagine, like a vacuum cleaner in the 1800s. But guess what: LLMs aren't going to help you build those things. They can create shitty images and clones of SaaS products that have been built 50x over, and all around encourage people to be mediocre and destroy their creativity as their brains atrophy from use.
> They can get rid of 1/3 to 2/3 of their labor and make the same amount of money, so why wouldn't they?
Because companies want to make MORE money.
Your hypothetical company is now competing with another company that did the opposite, and now they get to market faster, fix bugs faster, add features faster, and respond to changes in the industry faster. That results in them making more, while your smaller-headcount company is stuck at the status quo.
Also, with regard to oil: the consumption of oil increased as it became cheaper. With AI we now have a chance to do projects that simply would have cost way too much to do 10 years ago.
> Which results in them making more
Not necessarily.
You are assuming that people can consume whatever is put in front of them. Markets get saturated fast. The "changes in the industry" mean nothing.
A) People are so used to infinite growth that it’s hard to imagine a market where that doesn’t exist. The industry can have enough developers and there’s a good chance we’re going to crash right the fuck into that pretty quickly. America’s industrial labor pool seemed like it provided an ever-expanding supply of jobs right up until it didn’t. Then, in the 80s, it started going backwards preeeetttty dramatically.
B) No amount of money will make people buy something that doesn’t add value to or enrich their lives. You still need ideas, for things in markets that have room for those ideas. This is where product design comes in. Despite what many developers think, there are many kinds of designers in this industry and most of them are not the software equivalent of interior decorators. Designing good products is hard, and image generators don’t make that easier.
> With AI we now have a chance to do projects that simply would have cost way too much to do 10 years ago.
Not sure about that, at least if we're talking about software. Software is limited by complexity, not the ability to write code. Not sure LLMs manage complexity in software any better than humans do.
The price of oil at the price of water (ecology aside) should be a good thing.
Automation should obviously be a good thing, because more is produced with less labor. What does it say about ourselves and our politics that so many people (me included) are afraid of it?
In a sane world, we would realize that, in a post-work world, the owners of the robots have all the power, so the robots should be owned in common. The solution is political.
Throughout history, empires have bet their entire futures on the predictions of seers and magicians, and done so with enthusiasm. When political leaders think their court magicians can give them an edge, they'll throw the baby out with the bathwater to take advantage of it. It seems to me that the machine learning engineers and AI companies are the court magicians of our time.
I certainly don't have much faith in the current political structures; they're uneducated on most subjects they're in charge of and take the magicians at their word. The magicians have just gotten smarter and don't call it magic anymore.
I would actually call it magic though, just magic that's real. Imagine explaining to political strategists from 100 years ago the ability to influence politicians remotely while they sit in a room by themselves: dictating what target politicians see on their phones and feeding them content to steer them in certain directions. It's almost like synthetic remote viewing. And if that doesn't work, you also have buckets of cash :|
What do we "need" more of? Here in France we need more doctors, more nurseries, more teachers… I don't see AI helping much there in the short to medium term (for teachers, all research points to AI making things massively worse, even).
Globally I think we need better access to quality nutrition and more affordable medicine. Generally cheaper energy.
Isn’t the end game that all the displaced SWEs give up their cushy, flexible job and get retrained as nurses?
There is nothing you can always keep adding more of and have it automatically be effective.
I tend to automate too much because it's fun, but if I'm being objective, in many cases it has been more work than doing the stuff manually. Because of laziness I tend to way overestimate how much time and effort it would take to do something manually if I just rolled up my sleeves and did it.
Whether automating something actually produces more with less labor depends on the nuances of each specific case; it's definitely not a given. People tend to be very biased when judging actual productivity. E.g., is someone who quickly closes tickets but causes a disproportionate amount of production issues, money-losing bugs, or review work for others really that productive in the end?
While I agree, I am not hopeful. The incentive alignment has us careening towards Elysium rather than Star Trek.
This is just a theory of mine, but the fact that people don't see LLMs as something that will grow the pie and increase their output leading to prosperity for all just means that real economic growth has stagnated.
From all my interactions with C-level people as an engineer, what I learned from their mindset is their primary focus is growing their business - market entry, bringing out new products, new revenue streams.
As an engineer I really love optimizing our current infra and building tools and improved workflows, which many of my colleagues have considered a godsend, but from a C-level perspective it seems to be just a minor nice-to-have.
While I don't necessarily agree with their worldview, some part of it is undeniable: you can easily build an IT company with very high margins, say a 3x revenue/expense ratio, and in that case growing revenue is a much more lucrative way of growing the company than trimming costs.
I don’t think we are running out of work to do… there seems to be an endless amount of work to be done. And most of it comes from human needs and desires.
Here is a very real example of how an LLM can at least save, if not create jobs, and also not take a programmers job:
I work for a cash-strapped nonprofit. We have a business idea that can scale up a service we already offer. The new product is going to need coding, possibly a full-scale app. We don't have any capacity to do it in-house and don't have an easy way to find or afford a vendor that can work on this somewhat niche product.
I don't have the time to help develop this product but I'm VERY confident an LLM will be able to deliver what we need faster and at a lower cost than a contractor. This will save money we couldn't afford to gamble on an untested product AND potentially create several positions that don't currently exist in our org to support the new product.
There are tons of underprivileged college grads or soon-to-be grads who could really use the experience, and pro bono work for a nonprofit would look really good on their CVs. Have you considered contacting a local university's CS department? This seems more valuable to society from a nonprofit's perspective, imo, than giving that money/work to an AI company. It's not like the students don't have access to these tools, and they will be able to leverage them more effectively while getting the same outcome for you.
Do you have someone who can babysit and review what the LLM does? Otherwise, I'm not sure we're at the point where you can just tell an agent to go off and build something and it does it _correctly_.
IME, you'll just get demoware if you don't have the time and attention to detail to really manage the process.
But if you could afford to hire a worker for this job, a job that an LLM would be able to do for a fraction of the cost (by your estimation), then why on earth would you ever waste money on a worker? By extension, if you pay a worker and an AI or robot comes along that can do the work for cheaper, why would you not fire the worker and replace them with the cheaper alternative?
It's kind of funny to see capitalist brains all over this thread desperately trying to make it make sense. It's almost like the system is broken, but that can't possibly be right: everybody believes in capitalism, and everybody can't be wrong. Wake the fuck up.
New people hired for this project would not be coders. They would be an expert in the service we offer, and would be doing work an LLM is not capable of.
I don't know if LLMs would be capable of also doing that job in the future, but my org (a mission-driven non profit) can get very real value from LLMs right now, and it's not a zero-sum value that takes someone's job away.
> They can get rid of 1/3 to 2/3 of their labor and make the same amount of money, so why wouldn't they?
Competition may encourage companies to keep their labor. For example, in the video game industry, if the competitors of a company start shipping their games to all consoles at once, the company might want to do the same. Or if independent studios start shipping triple A games, a big studio may want to keep their labor to create quintuple A games.
On the other hand, even in an optimistic scenario where labor is still required, the skills required for the jobs might change. And since the AI tools are not mature yet, it is difficult to know which new skills will be useful in ten years from now, and it is even more difficult to start training for those new skills now.
With the help of AI tools, what would a quintuple A game look like? Maybe once we see some companies shipping quintuple A games that have commercial success, we might have some ideas on what new skills could be useful in the video game industry for example.
Yeah, but there's no reason to assume this is even a possibility. Software companies that are making more money than ever are slashing their workforces. Those garbage Coke and McDonald's commercials clearly show big industry is trying to normalize bad quality rather than elevate its output. In theory, cheap overseas tweening shops should have allowed the mid-century American cartoon industry to produce incredible quality at the same price, but instead there was a race straight to the bottom. I'd love to have even a shred of hope that the future you describe is possible, but I see zero empirical evidence that anyone is even considering it.
> everyone has access to the same models and basic thought processes
Why didn't Warner acquire Netflix, then, rather than the other way around, even though both had access to the same labor market (the human equivalent of an LLM)?
I think real economics is a little more complex than the "basic economics" referenced in your reply.
This does not negate the possibility that enterprises will double down on replacing everyone with AI, though. But it does negate the reasoning behind the claim and the predictions made.
> Its also worth noting that if you can create a business with an LLM
If that were true, LLM companies would just use it themselves to make money rather than sell and give away access to the models at a loss.
> Their goal is to monopolize labor for anything that has to do with i/o on a computer, which is way more than SWE. It's simple: this technology cannot create new jobs; it simply lets one engineer (or any worker whose job involves computer i/o) do the work of three, therefore allowing you to replace workers (and overwork the ones you keep). Companies don't need "more work"; half the "features"/"products" that companies produce are already just extra. They can get rid of 1/3 to 2/3 of their labor and make the same amount of money, so why wouldn't they?
Most companies have "want to do" lists much longer than what actually gets done.
I think the question for many will be whether it is actually useful to do all of that. For instance, there's only so much feature-rollout/user-interface churn that users will tolerate for software products. Or, for a non-software company whose backlog is full of things like "investigate and find a new ERP system", how long will that backlog keep being replenished?
> It's also worth noting that if you can create a business with an LLM, so can everyone else. And sadly everyone has the same ideas; everyone ends up working on the same things, causing competition to push margins to nothing.
This was true before LLMs. For example, anyone can open a restaurant (or a food truck). That doesn't mean that all restaurants are good or consistent or match what people want. Heck, you could do all of those things but if your prices are too low then you go out of business.
A more specific example with regards to coding:
We had books, courses, YouTube videos, coding boot camps etc but it's estimated that even at the PEAK of developer pay less than 5% of the US adult working population could write even a basic "Hello World" program in any language.
In other words, I'm skeptical of "everyone will be making the same thing" (emphasis on the "everyone").
> And sadly everyone has the same ideas
I'm not sure that's true. If LLMs can help researchers implement (not find) new ideas faster, they effectively accelerate the progress of research.
Like many other technologies, LLMs will fail in areas and succeed in others. I agree with your take regarding business ideas, but the story could be different for scientific discovery.
One thing that's clear: LLMs cannot come up with novel ideas.
> It's also worth noting that if you can create a business with an LLM, so can everyone else. And sadly everyone has the same ideas
Yeah, people are going to have to come to terms with the "idea" equivalent of "there are no unique experiences". We're already seeing the bulk move toward the meta SaaS (Shovels as a Service).
> It's also worth noting that if you can create a business with an LLM, so can everyone else.
One possibility may be that we normalize making bigger, more complex things.
In pre-LLM days, if I whipped up an application in something like 8 hours, it would be a pretty safe assumption that someone else could easily copy it. If it took me more like 40 hours, I still have no serious moat, but fewer people would bother spending 40 hours to copy an existing application. If it took me 100 hours, or 200 hours, fewer and fewer people would bother trying to copy it.
Now, with LLMs... what still takes 40+ hours to build?
The arrow of time leads towards complexity. There is no reason to assume anything otherwise.
There's an older article that gets reposted to HN occasionally, titled something like "I hate almost all software". I'm probably more cynical than the average tech user and I relate strongly to the sentiment. So so much software is inexcusably bad from a UX perspective. So I have to ask, if code will really become this dirt cheap unlimited commodity, will we actually have good software?
Depends on whether you think good software comes from good initial design (then yes, via the monkeys with typewriters path) or intentional feature evolution (then no, because that's a more artistic, skilled endeavor).
Anyone who lived through 90s OSS UX and MySpace would likely agree that design taste is unevenly distributed throughout the population.
I don't disagree with everything you are saying. But you seem to be assuming that contributing to technology is a zero sum game when it concretely grows the wealth of the world.
> If everyone had an oil well on their property that was affordable to operate the price of oil would be more akin to the price of water.
This is not necessarily even true https://en.wikipedia.org/wiki/Jevons_paradox
Jevons paradox is known as a paradox for a reason. It's not "Jevons' law that totally makes sense and always happens".
If one person can do the job of three, then you can keep output the same and reduce headcount, or maintain headcount and improve output etc.
Anecdotally it seems demand for software >> supply of software. So in engineering, I think we’ll see way more software. That’s what happened in the Industrial Revolution. Far more products, multiple orders of magnitude more, were produced.
The Industrial Revolution was deeply disruptive to labour, even whilst creating huge wealth and jobs. Retraining is the real problem. That's what we will see in software. If you can't architect and think well, you'll struggle. Being able to write boilerplate and repetitive low-level code is a thing of the past. But there are jobs - you're going to have to work hard to land them.
Now, if AGI or superintelligence somehow renders all humans obsolete, that is a very different problem but that is also the end of capitalism so will be down to governments to address.
Which leads to the uncomfortable but difficult to avoid conclusion that having some friction in the production of code was actually helping because it was keeping people from implementing bad ideas.
I have a few app ideas that I've been sitting on for years and they would all be things that would help me, things that I would actually use.. But they're also things that I think others would find useful. I had Claude Code create two of them so far, and yeah the code isn't what I would write, but the apps generally work and are useful to me. The idea of trying to monetize these apps that I didn't even write is strange to me, especially considering anyone else can just tell their Claude Code to "create an app that's a clone of appwebsite.com" and within an hour they will probably have a virtually identical clone of my app that I'm trying to charge money for.
In this way, AI coding is a bummer. I also sincerely miss writing code. Merely reading it (or being a QA and telling Claude about bugs I find) is a shell of what software engineering used to be.
I know with apps especially, all that really matters is how large your user base is, but to spend all that time and money getting the user base, only for them to jump ship next month for an even better vibe-coded solution... eh. I don't have any answers, I just agree that everyone has the same ideas and it's just going to be another form of enshittification. "My AI slop is better than your AI slop".
Retail water[1] costs $881/bbl which is 13x the price of Brent crude.
[1] https://www.walmart.com/ip/Aquafina-Purified-Drinking-Water-...
What a good faith reply. If you sincerely believe this, that's a good insight into how dumb the masses are. Although I would expect a higher quality of reply on HN.
You found the most expensive 8-pack of water on Walmart. Anyone can put a listing on Walmart; it's the same model as Amazon. There's also a listing right below it for bottles twice the size, and a 32-pack for a dollar less.
It costs $0.001 per gallon out of your tap, and you know it.
I'm in South Australia, the driest state on the driest continent; we have a backup desalination plant, and water security is a regular item on the political agenda. Water here is probably as expensive as anywhere in the world.
"The 2025-26 water use price for commercial customers is now $3.365/kL (or $0.003365 per litre)"
https://www.sawater.com.au/my-account/water-and-sewerage-pri...
Water just comes out of a tap?
My household water comes from a 500 ft well on my property, requiring a submersible pump costing $5000 that gets replaced every 10-15 years or so, with a rig and service that cost another $10k. Call it $1000/year... but it also requires a giant water softener, in my case a commercial one that amortizes out to $1000/year, plus a monthly expenditure of $70 for salt (admittedly I have exceptionally hard water).
And of course, I, and your municipality too, don't (usually) pay any royalties to "owners" of water that we extract.
Water is, rightly, expensive, and not even expensive enough.
Just for completeness, it's about $0.023/gal in Pittsburgh (1): still perfectly affordable, but 23x more than $0.001, and still about 50x less than Brent crude.
(1) Combined water + sewer fees. Sewer charges are based on your water consumption, so they roll into the per-gallon effective price. https://www.pgh2o.com/residential-commercial-customers/rates
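For what it's worth, a quick back-of-envelope conversion of the figures quoted in this sub-thread to a common unit ($/US gallon); treat it as a sketch, since the Brent price here is only what the 13x claim above implies, not a looked-up quote:

```python
# Back-of-envelope conversion of the prices quoted in this sub-thread.
# Assumptions: 42 US gallons per oil barrel, 3.785 litres per US gallon,
# and Brent inferred from the "$881/bbl is 13x Brent" claim above.

GAL_PER_BBL = 42
LITRES_PER_GAL = 3.785

prices_per_gal = {
    "Aquafina (as listed)":   881.0 / GAL_PER_BBL,             # quoted at $881/bbl
    "Brent crude (implied)":  (881.0 / 13) / GAL_PER_BBL,      # 13x claim => ~$68/bbl
    "Rough US tap water":     0.001,                           # quoted above
    "Pittsburgh water+sewer": 0.023,                           # quoted above
    "SA Water commercial":    3.365 / 1000 * LITRES_PER_GAL,   # quoted at $3.365/kL
}

for name, price in prices_per_gal.items():
    print(f"{name:>24}: ${price:,.3f}/gal")
```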
Decreasing COGS creates wealth and consumer surplus, though.
If we can flatten the social hierarchy to reduce the need for social mobility then that kills two birds with one stone.
Do you really think the ruling class has any plans to allow that to happen... There's a reason so much surveillance tech is being rolled out across the world.
If the world needs 1/3 of the labor to sustain the ruling class's desires, they will try to reduce the amount of extra humans. I'm certain of this.
My guess is that during this "2nd industrial revolution" they will make young men so poor through the alienation of their labor that they beg to fight in a war. In doing so they will get young men (and women) to secure resources for the ruling class and purge themselves in the process.
In a simplified economic model though.
Reply to your edit: what if what we wanted to do with the water was simply to drink it?
"Meritocratic climbing of the social ladder"? I'm sorry, but what are you on about? As if that were the meaning of life? As if that were even a goal in itself?
If there's one thing we need to learn in the age of AI, it's not to confuse the means to an end with the end itself!
Yeah, a Stratocaster guitar is available to everybody too, but not everybody's an Eric Clapton.
I can buy the CD From the Cradle for pennies, but it would cost me hundreds of dollars to see Eric Clapton live
This is correct. An LLM is a tool. Having a better guitar doesn't make you sound good if you don't know how to play. If you were a low-skill software/systems architect before LLMs, you're gonna be a bad one after as well. Someone at some point is deciding what the agent should be doing. LLMs compete more with entry-level folks / juniors.