Comment by viraptor
3 days ago
> It's hard to think of any other major tech product where it's acceptable to shift so much blame on the user.
It's completely normal in development. How many years of programming experience do you need for almost any language? How many days or weeks do you need to use debuggers effectively? How long does it take from first contact with version control until you actually get git?
I think it's the opposite actually - it's common that new classes of tools in tech need experience to use well. Much less if you're moving to something different within the same class.
> LLMs, especially at the scale we see today
The OP describes how the marketing cycle for this product is beyond extreme and in a category of its own.
Normal people are being told to worry about AI ending the world, or all jobs disappearing.
Simply saying "the problem is the user", without acknowledging the degree of hype and expectation-setting, is irresponsible.
AI marketing isn't extreme - not on the LLM vendor side, at least; the hype is generated downstream of it, for various reasons. And it's not the marketing that's saying "you're using it wrong" - it's other users. So, unless you believe everyone reporting a good experience with LLMs is a paid shill, there might actually be some merit to it.
You have to be pretty naive to think VCs don't astroturf forums and let random mobs steer discussions about their investments. Even dinosaurs like Microsoft have been caught doing exactly that many times, including fake "letters to the editor" campaigns back when newspapers were a thing.
It is extreme, and on the vendor side. The OpenAI non-profit vs. for-profit saga was framed as profit-seeking vs. the future of humanity. People are talking about programming 3.0.
I can appreciate that it's other users who are saying it's wrong, but that doesn't escape the point about ignoring the context.
Moreover, it’s unhelpful communication. It gives up on acknowledging a mutually shared context: the natural confusion that arises from the ambiguous, high-level hype versus the actual down-to-earth reality.
Even if you have found a way to make it work, having someone understand your workflow can’t happen without connecting the dots between their frame of reference and yours.
I think the relentless podcast blitz by OpenAI and Anthropic founders suggests otherwise. They're both keen to confirm that yes, in 5-10 years, no one will have any jobs anymore. They're literally out there discussing a post-employment world like it's an inevitability.
That's pretty extreme.
> And it's not the marketing that's saying "you're using it wrong" - it's other users.
No, it's the non-coding managers who vibe-coded a half-working prototype, not other users. And here, the Dunning-Kruger effect is at play - those non-coding types do not understand that AI is not working for them either.
Full disclosure: I do rely on vibe-coded jq lines in one-off scripts that will definitely not process more data after the single intended use, and this is where AI saves me time.
It's called grassroots marketing. It works particularly well in the context of GenAI because it is fed with esoteric and ideological fragments that overlap with common beliefs and political trends. https://en.wikipedia.org/wiki/TESCREAL
Therefore, classical marketing is less dominant, although it is more present among downstream sellers.
It is completely typical, but at the same time abnormal to have tools with such poor usability.
A good debugger is very easy to use. I remember the Visual Studio debugger or the C++ debugger on Windows being a piece of cake 20 years ago, while gdb is still painful today. Java and .NET had excellent integrated debuggers, while golang had a crap debugging story for so long that I don’t even use a debugger with it. In fact, I almost never use debuggers any more.
Version control - same story. CVS, for all its problems, I learned to use almost immediately, and it had a GUI that was straightforward. git I still have to look up commands for in some cases. Literally all the good git UIs cost a non-trivial amount of money.
Programming languages are notoriously full of unnecessary complexity. Personal pet peeve: Rust lifetime management. If this is what it takes, just use GC (and I am - golang).
> git I still have to look up commands for in some cases
I believe that this is okay. One does not need to know the details about every specific git command in order to be able to use it efficiently most of the time.
It is the same with a programming language. Most people are unfamiliar with every peculiarity of every standard library function the language offers. And that is okay. It does not prevent them from using the language efficiently most of the time.
Also in other aspects of life, it is unnecessary to know everything from memory. For example, one does not need to remember how to replace a blade on a lawn mower. But that is okay. It does not prevent them from using it efficiently most of the time.
The point is that if something is done less often, it is unnecessary to remember the specifics of it. It is fine to look it up when needed.
> It is completely typical, but at the same time abnormal to have tools with such poor usability.
The main difference I see is that LLMs are flaky - they are getting better over time, but they are still flakier than traditional tooling like debuggers.
> Programming languages are notoriously full of unnecessary complexity. Personal pet peeve: Rust lifetime management. If this is what it takes, just use GC (and I am - golang).
Lifetime management is an inherently hard problem, especially if you need to be able to reason about it at compile time. I think there are some arguments to be made about tooling or syntax making reasoning about lifetimes easier, but not trivial. And in certain contexts (e.g., microcontrollers) garbage collectors are out of the question.
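To make the compile-time part concrete, here's a minimal Rust sketch (my own illustrative example, nobody's real code): the borrow checker has to prove that a returned reference never outlives whatever it borrows from, and the 'a annotation is how you state that contract.

    // The compiler must verify at build time that the returned
    // reference cannot outlive either input it might borrow from.
    fn longer<'a>(x: &'a str, y: &'a str) -> &'a str {
        if x.len() > y.len() { x } else { y }
    }

    fn main() {
        let a = String::from("long enough");
        let result;
        {
            let b = String::from("short");
            result = longer(&a, &b);
            println!("{result}"); // fine: `b` is still alive here
        }
        // println!("{result}"); // rejected: `result` may borrow from
        //                       // `b`, which has already been dropped
    }

The compiler can't know at compile time which branch runs, so it conservatively ties `result` to the shorter of the two lifetimes - exactly the kind of reasoning that a GC lets you skip at runtime cost.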
Nitpick: everyone I’ve seen talk about magit for Emacs describes it as “the best git client”, and it is completely free.
Linus did not show up in front of Congress talking about how dangerously powerful unregulated version control was to the entirety of human civilization a year before he debuted git, nor did he charge thousands a year to use it.
This seems like a non sequitur. What does this have to do with this thread?
It is completely reasonable to hold cursor/claude to a different standard than gdb or git.
Ok. You seem to be talking about a completely different issue: regulation.
Hmmm, I don't see it. Are debuggers hard to use? Sometimes. But the debugger allows you to do something you couldn't actually do before, i.e. set breakpoints and step through your code. So, while tricky to use, you are still in a better position than you would be without it. Just because you can get better at using something doesn't automatically mean that using it as a beginner makes you worse off.
The same can be said for version control and programming.
I guarantee you there were millions of people who needed to be forced to use Excel because they thought they could do the calculations faster by hand.
We retroactively assume that everyone just obviously adopts new technology, yet I'm sure there were tons and tons of people who retired rather than learn how computers worked when the PC revolution was happening.
> How many days/weeks you need to use debuggers effectively
I understand your point, but would counter with: gdb isn't marketed as a cuddly tool that can let anyone do anything.