Comment by GoatInGrey
2 days ago
You are welcome to share how AI has transformed a revenue-generating role. Personally, I have never seen a durable example of it, despite my excitement about the tech.
In my world, AI has been little more than a productivity boost in very narrowly scoped areas. For instance, generating an initial data mapping of source data against a manually built schema for the individual to then review and clean up. In this case, AI is helping the individual get results faster, but they're still "doing" data migrations themselves. AI is simply a tool in their toolbox.
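To make that concrete, here is a minimal sketch of the kind of workflow I mean. The schema, the column names, and the "LLM output" dict are all hypothetical stand-ins, but they show the shape of it: the model proposes a source-to-target mapping, and the person still reviews every line against the hand-built schema.

    target_schema = {"customer_id": int, "signup_date": str, "plan": str}

    # Pretend this mapping came back from an LLM prompted with the source columns.
    proposed_mapping = {
        "cust_no": "customer_id",
        "created_at": "signup_date",
        "tier_name": "plan",
        "legacy_flag": "???",   # model wasn't sure; this is where the human comes in
    }

    # The person still "does" the migration: anything mapped to a field
    # that isn't in the schema gets flagged for manual review.
    unresolved = {src: tgt for src, tgt in proposed_mapping.items()
                  if tgt not in target_schema}
    print("needs manual review:", unresolved)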
What you've described is reasonable, and a clear takeaway is that AI is a time-saving tool you should learn.
Where I share the parent's concern is with the claims that AI is useless. That isn't coming from your post at all, but I have definitely seen instances of it in the programmer community to this day. In other words, the parent's concern that some programmers are missing the train is unfortunately completely warranted.
I went through the parent comments, looking for a claim somewhere that AI was "useless." I couldn't find one.
Yes, there are lots of skeptics among programmers when it comes to AI. I was one myself (and still am, depending on what we're talking about). My skepticism was rooted in the fact that AI is trained on human-generated output. Most human-written code is not very good, so AI is going to produce not-very-good code by design, because that's what it was trained on.
Then you add to that the context problem. AI is not very good at understanding your business goals, or the nuanced intricacies of your problem domain.
All of this pointed to the fact, very early on, that AI would not be a good tool to replace programmers. And THAT'S the crux of why so many programmers pushed back. Because the hype was claiming that automation was coming for engineering jobs.
I have started to use LLMs regularly for a variety of tasks, including some engineering work. But I always end up spending a lot of time refactoring the code LLMs produce for me. And much of the time I find that I'm still learning what the LLMs can do for me that truly saves time, versus what would have been faster to just write myself in the first place.
LLMs are not useless. But if only 20% of a programmer's time is actually spent writing code on average, then even if you net a 50% increase in coding productivity... you're only netting a 10% overall productivity gain for an engineer, BEST CASE SCENARIO.
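Just to show the arithmetic behind that (the 20% and 50% inputs are the figures above; the time-based variant is my own framing, so take it as a sketch):

    # Back-of-envelope check of the 20% / 50% / 10% figures above.
    coding_share = 0.20       # fraction of an engineer's time spent writing code
    coding_speedup = 1.5      # "50% more productive" at the coding part

    # Naive upper bound: treat the whole coding gain as free time.
    naive_gain = coding_share * 0.5                    # 0.10, the "10% best case"

    # Time-based version: only the coding share of the week shrinks.
    new_total_time = (1 - coding_share) + coding_share / coding_speedup
    overall_gain = 1 / new_total_time - 1              # ~0.07, about 7%

    print(f"naive best case: {naive_gain:.0%}, time-based: {overall_gain:.1%}")

The naive product is the 10% best case; if only the coding share of the week actually shrinks, the overall gain comes out closer to 7%, which is why "best case scenario" is doing real work in that sentence.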
And that's not "useless," but compared to the hype and bullshit coming out of the mouths of CEOs, it's as good as useless. It tracks with the MIT study finding that only 5% of generative AI projects have netted ANY measurable returns for the business.
I know a company that replaced their sales call center with an AI calling bot. The bot got better sales and higher feedback scores from customers.
And I happen to know a different company that regrets their decision to do something similar:
https://tech.co/news/klarna-reverses-ai-overhaul
Is my anecdotal evidence any better than yours?
I'm going to interpret the two stories as "50% of businesses find LLMs useful" (sample size: 2).
I would argue yes, because you provided a source and a verifiable company name.
I know a company that did the same and lost billions of dollars.