
Comment by whiplash451

4 days ago

Honestly at this point if you’re a dev and not using a coding assistant at all and in any possible way, you are indeed becoming a liability to your company.

Now, Copilot (the assistant, not the GitHub coding one) is hot garbage compared to Claude/ChatGPT but that’s another story.

This is crazy. I use copilot completions occasionally, but on average I think it's been productivity neutral so far. Sometimes it helps, but this is roughly offset by fruitless rabbit holes and straight-up wrong information.

One of my co-workers never uses any of it. For certain types of problems, he's the most productive member on the team.

  • > One of my co-workers never uses any of it. For certain types of problems, he's the most productive member on the team.

    Sounds like he needs a PIP!

  • Last time somebody asked me at work if I was using it, I had just turned off Visual Studio's normal autocompletion, because the setting to disable completion on space, enter, dot, etc. had disappeared.

    MS productivity tools are a serious hindrance.

On the other hand, many, many (usually junior) engineers who have unfiltered access to coding assistants have basically become huge liabilities overnight with tools like Claude Code.

That's only true if you're making CRUD software and easily replaceable by any random programmer. For anything more serious LLMs are only useful as a better search engine.

> you are indeed becoming a liability to your company

The risk is that the stock price declines if you don't buy into the hype. Actual productivity is not a concern.

I haven't noticed coding assistants improving quality devs outside of perhaps occasionally saving on typing speed... and those opportunities are few and far between.

  • I've found it detrimental to quality because people at my workplace are now submitting code changes in languages they don't really understand (like in multithreaded C++ code). The time they saved is taken from my time having to explain all the problems with their code.

How, then, was any useful software written before 2023?

I notice that you said "using" though and did not specify useful output. Useful output or even just a GitHub repo is kryptonite for "AI" proponents.

If you mean that Microsoft has to pretend that its employees are using "AI" in order to keep its P/E ratio at roughly 40, then of course an employee who does not participate in the con becomes a liability, and you are absolutely right!

>> Honestly at this point if you’re a dev and not using a coding assistant at all and in any possible way, you are indeed becoming a liability to your company.

Honestly, you were probably a liability to your company prior to AI. Now you can at least vibe-code. </sarcasm> I don't know anything about your situation, nor do you know mine.

If you are submitting AI slop or overusing AI, then you are a liability.

If you don't use AI at all you might be 5 or 10% slower, but quality might well even make up for it.

  • I would say even more important than quality is just the fact that you will probably understand better. If an issue arises, you will already know the intricacies, while the AI-slop engineer almost has to ask the AI again. They've got themselves stuck in a sort of slop cycle where, after some time, human understanding is completely out of the picture.

    • AI does not save anyone from having to know the fundamentals of software engineering and systems, but that’s orthogonal.

      If you know them, AI supercharges you. If you don’t, you’re a lost cause no matter what.


    • The sad thing is management often just doesn't care about those kinds of things.

      For instance: my employer seems actively hostile to maintaining human understanding, even before AI. Ownership of apps moved around without sufficient knowledge transition or training. We've migrated wiki systems 2 or 3 times over my career, and stuff always gets lost. The last migration (to SharePoint) was downright hostile: it was presented as an opportunity to "clean up," the half-assed automated migration deliberately excluded anything more than a year old, and your docs got nuked unless someone was paying attention to save them (not a given). Now that SharePoint is in the cloud, its admins are actively scanning for things to delete, because the priority is minimizing their storage costs, not, you know, maintaining knowledge of how things work.