Comment by jalapenos
12 hours ago
The dumb part of this is: so who prompts the AI?
Well probably we'd want a person who really gets the AI, as they'll have a talent for prompting it well.
Meaning: knows how to talk to computers better than other people.
So a programmer then...
I think it's not that people are stupid. I think there's actually a glee behind the claims AI will put devs out of work - like they feel good about the idea of hurting them, rather than being driven by dispassionate logic.
Maybe it's the ancient jocks vs nerds thing.
Outside of SV the thought of More Tech being the answer to ever greater things is met with great skepticism these days. It's not that people hate engineers, and most people are content to hold their nose while the mag7 make 401k go up, but people are sick of Big Tech. Like it or not, the Musks, Karps, Thiels, Bezos's have a lot to do with that.
Popularity gets you nowhere though. What matters is money, and only money. Those 401k holders are tied down to the oligarchy.
Not imputing that to you, but there seem to be people out there who believe money is all that matters. The map with the richest details won't save anyone in a territory that was turned into a wasteland, unable to produce a single apple.
1 reply →
Devs are where projects meet the constraints of reality and people always want to kill the messenger.
No highly paid manager wants to learn that their visionary thinking was just the latest iteration of the underpants gnome meme. Some things sound good at first but unfortunately are not that easy to actually do.
Devs are where the project meets reality in general, and this is what I always try to explain to people. And it's the same with construction, by the way. Pictures and blueprints are nice but sooner or later you're going to need someone digging around in the dirt.
Some people just see it as a cost. At one "tech" startup I worked at, I got a lengthy pitch from a sales exec arguing that they shouldn't have a software team at all, that we'd never build anything useful without spending millions, and that the money would be better spent on the sales team, even though they'd then have nothing to sell lmfao. The real laugh was that the dev team was heavily subsidized by R&D grants anyway.
Even that is the wrong question. The whole promise of the stock market, and of AI, is that you can "run companies" by just owning shares and knowing nothing at all. I think that is what "leaders" hope to achieve. It's a slightly better dressed get-rich-quick scheme.
Invest $1000 into AI, have a $1000000 company in a month. That's the dream they're selling, at least until they have enough investment.
It of course becomes "oh, sorry, we happen to have taken the only huge business for ourselves. Is your kidney now for sale?"
> Invest $1000 into AI, have a $1000000 company in a month. That's the dream they're selling, at least until they have enough investment.
But you need to buy my AI engineer course for that first.
Who fixes the unmaintainable mess that the AI created from the vibe coder's prompts?
The vibe coder? The AI?
Take a guess who fixes it.
The real question is, do you even need to fix it? Does it matter?
The reason those things matter in a traditional project is because a person needs to be able to read and understand the code.
If you're vibe coding, that's no longer true. So maybe it doesn't matter. Maybe the things we used to consider maintenance headaches are irrelevant.
If these things can ever actually think and understand a codebase, this mindset makes sense, but as of now it's a short-sighted way to work. The quality of the output is usually not great, and in some cases terrible. If you're just blindly accepting code with no review, eventually things are going to implode, and the AI is more limited than you are in understanding why. It's not going to save you in its current form.
The reason those things matter in a traditional project is because the previous developers fucked up, and the product is now crashing and leaking money and clients like a sinking Titanic.
For now, training these things on code and logic is the first step of building a technological singularity.
They don't need to put all developers out of work to have a financial impact on the career.
How about another AI? And who prompts that AI? You're right - another AI!
With all these AIs chaining and prompting each other, we're approaching the point where some unlucky person is going to ask an AI something and it will consume all the energy in the universe trying to compute the answer.
Only to get in response: “INSUFFICIENT DATA FOR MEANINGFUL ANSWER”
The answer would be 42.
The day you successfully implement your solution with a prompt, your solution is valued at the cost of a prompt. There is no value left in anything easily achieved by generative tools. The value now lies in either:
a. generative technology that still requires a substantial amount of coordination, curation, and compute power;
b. a substantial amount of data; or
c. scarce intellectual human work.
And scarce but non-intellectually-demanding human work has been dropped from the list of valuable things.
> who prompts the AI
LLMs are a box where the input has to be generated by someone/something, but also the output has to be verified somehow (because, like humans, it isn't always correct). So you either need a human at "both ends", or some very clever AI filling those roles.
But I think the human doing those things probably needs slightly different skills and experience than the average legacy developer.
Rules engines were designed for just such a thing. Validating input/output. You don’t need a human to prompt AI, you need a pipeline.
While a single LLM won't replace you, a well-designed system of flows for software engineering using LLMs will.
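The pipeline idea above can be sketched as a loop in which a rules engine validates the model's output and retries on failure, with a human only at the escalation point. This is a minimal illustration, not anyone's actual system: `call_llm` is a hypothetical stub standing in for a real model call, and the two rules are placeholder checks.

```python
import re

def call_llm(prompt: str, attempt: int) -> str:
    # Stub: pretend the model answers chattily on the first try,
    # then returns bare code. A real pipeline would call an actual model.
    if attempt == 0:
        return "Sure! Here is your function..."
    return "def add(a, b):\n    return a + b"

# The "rules engine": simple predicates the output must satisfy.
RULES = [
    lambda out: out.startswith("def "),       # must be bare code, no chatter
    lambda out: re.search(r"\breturn\b", out) is not None,
]

def pipeline(prompt: str, max_attempts: int = 3):
    for attempt in range(max_attempts):
        out = call_llm(prompt, attempt)
        if all(rule(out) for rule in RULES):
            return out        # rules engine accepts the output
    return None               # all attempts failed: escalate to a human

print(pipeline("write an add function"))
```

The point of the sketch is that the validation step, not the prompting step, is where the mechanical work lives; whether simple predicates are enough in practice is exactly what the comments above are arguing about.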
Well, who designs the system of flows?
2 replies →