Comment by shaky-carrousel

6 months ago

Hours of time saved, and you learned nothing in the process. You are slowly becoming a cog in the LLM process instead of an autonomous programmer. You are losing autonomy and depending more and more on external companies. And one day, with all that power, they'll set whatever price or conditions they want, and you will accept. That's the future we're heading toward, and it's not inevitable.

Did you build the house you live in? Did you weave your own clothes or grow your own food?

We all depend on systems others built. Determining when that trade-off is worthwhile and recognizing when convenience turns into dependence are crucial.

  • Did you write your own letters? Did you write your own arguments? Did you write your own code? I do, and I don't depend on systems others built to do so. And losing the ability to keep doing so is a pretty big trade-off, in my opinion.

    • There seems to be a mistaken assumption that having an AI (or indeed someone else) help you achieve a task means you aren't learning anything. That's reductionist. I suggest instead that it's about degrees of autonomy. The person you're responding to made a choice to get the AI to help integrate a library. They chose NOT to have the AI edit the files itself; instead, they spent time reading through the changes, understanding the integration points, and tweaking the code to make it their own. That's very different from vibe coding.

      I do a similar loop with my use of AI: I upload code to Gemini 2.5 Pro, talk through options and assumptions, and maybe get it to write some or all of the next step, or to try out different approaches to a refactor. Integrating any code back into the original source is never copy-and-paste, and that's where the learning is.

      For example, I added Dexie (a wrapper library for IndexedDB) to a browser extension project the other day. The AI helped me get started with a minimal amount of initial knowledge, yet I learned a lot about Dexie and have been able to expand upon the code myself since. On my own, I would probably have barrelled ahead and used IndexedDB directly, resulting in a lot more boilerplate code and time spent on busywork (a rough sketch of that difference appears after this thread).

      It's this sort of friction reduction that I find most liberating about AI. Trying out a new library isn't a multi-hour slog; you can sample it and possibly reject it as unsuitable almost immediately, without wasting a lot of time on R&D. In my case, I didn't learn 'raw' IndexedDB, but I got the job done with a library offering a more suitable level of abstraction, and saved hours in the process.

      This isn't laziness or a squandered opportunity to learn; it's simply optimising your time.

      The "not invented here" syndrome is something I kindly suggest you examine, as you may find you are actually limiting your own innovation by rejecting everything that you can't do yourself.


    • Unless you're writing machine code, you aren't really writing your own code either. You're giving high-level instructions, which depend on many complex systems built by thousands of engineers to actually run.


    • > Did you write your own letters? Did you write your own arguments? Did you write your own code? I do, and I don't depend on systems others built to do so. And losing the ability to keep doing so is a pretty big trade-off, in my opinion.

      Gatekeeping at its finest: you're not a "true" software engineer unless you're editing the kernel on your own, locked in a cubicle, with no external help.


  • We're talking about a developer here, so this analogy does not apply. If a developer doesn't actually develop anything, what exactly is he?

    > We all depend on systems others built. Determining when that trade-off is worthwhile and recognizing when convenience turns into dependence are crucial.

    I agree with this, and that's exactly what OP is saying: you're now a cog in the LLM pipeline and nothing else.

    If we lived in a saner world, this would be purely a net positive, but in our current society it simply means we'll be replaced by the cheaper alternative the second it becomes viable, making any dependence on it extremely risky.

    And it's not only individuals. What happens when our governments depend on LLMs from these private corporations to function, and those corporations start the enshittification phase?

    • > We're talking about a developer here, so this analogy does not apply. If a developer doesn't actually develop anything, what exactly is he?

      A problem solver

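To ground the Dexie example mentioned above: here is a minimal sketch, not the commenter's actual code, of what the library abstracts away. The database name, the notes table, and its fields are invented for illustration, and the sketch assumes Dexie 3+ with its bundled TypeScript types. Raw IndexedDB would need an onupgradeneeded handler, explicit transactions, and request.onsuccess/onerror callbacks for each of these calls.

```typescript
// Sketch only: schema and names are hypothetical, not from the thread.
import Dexie, { type Table } from 'dexie';

interface Note {
  id?: number;       // '++id' below makes this an auto-incremented primary key
  title: string;
  createdAt: number; // epoch milliseconds
}

// Declaring the schema: one versioned call replaces IndexedDB's
// onupgradeneeded ceremony.
class ExtensionDB extends Dexie {
  notes!: Table<Note, number>;

  constructor() {
    super('extension-db'); // database name (invented for this sketch)
    this.version(1).stores({
      notes: '++id, title, createdAt', // indexes on title and createdAt
    });
  }
}

const db = new ExtensionDB();

async function demo(): Promise<void> {
  // Each call runs in an implicit transaction; no request callbacks needed.
  await db.notes.add({ title: 'hello', createdAt: Date.now() });

  const dayAgo = Date.now() - 24 * 60 * 60 * 1000;
  const recent = await db.notes.where('createdAt').above(dayAgo).toArray();
  console.log(recent);
}

demo().catch(console.error);
```

This is the trade-off the thread is debating in a nutshell: the promise-based one-liners above are what "saved hours" looks like, and whether skipping the underlying IndexedDB event plumbing counts as lost learning or as a better level of abstraction is exactly the disagreement.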

> and you learned nothing in the process.

Why do you presume the person wanted to learn something, rather than to get the work done ASAP? Maybe they're not interested in learning, or maybe they have something more important to do, and saving this time is a lifesaver.

> You are losing autonomy and depending more and more on external companies

Do you also autonomously produce your own clean water, electricity, gas, and food? Or do you rely on external companies to provision all of those things?

  • The pretty big difference is that I'm not easily able to produce my own electricity or food, but I am easily able to produce my own code. We are giving up autonomy we already have, out of pure laziness, and it will bite us.

> Hours of time saved, and you learned nothing in the process

Point-and-click "engineer" 2.0

We all know this.

Eventually someone has to fix the mess, and it won't be him; he'll be in management by then.

  • > We all know this

    Unfortunately, reading this thread and many other comments on similar articles, it seems that many of us have no clue about this.

    We are in for a rough ride until we figure this out.