
Comment by paxys

4 days ago

Leadership - "All employees are required to use AI and be X% more productive"

Employees find that AI tools are useless and don't increase productivity.

Leaders say - You are using the tools wrong. Figure it out.

Employees now have to work longer hours or risk getting fired.

Company does layoffs under the guise of "we replaced workers with AI" and the stock market rewards them for it.

> Leaders say - You are using the tools wrong. Figure it out.

> Employees now have to work longer hours or risk getting fired.

Bahahaha. In the best of all possible worlds.

Likely reality: Leaders say you're using the tools wrong, figure it out.

Employees: OK, so we train on the tools after hours?

Leaders: No.

Employees: Before hours?

Leaders: No.

Employees: During lunch?

Leaders: No.

Employees: Then when are we supposed to learn how to use the bloody tools?

Leaders: You're just going to have to figure that out for yourself.

Source: This happened to my mom when her insurance company moved from mainframe to web-based systems.

Maybe I could create a grift company that tells other companies why they are using AI wrong.

After working for a few weeks at a high hourly rate, produce a report. Then grift the next suites.

  • > After working for a few weeks at a high hourly rate, produce a report. Then grift the next suites.

    Bonus points if you just have ChatGPT spit out the report :D

Honestly, at this point, if you’re a dev not using a coding assistant at all, in any possible way, you are indeed becoming a liability to your company.

Now, Copilot (the assistant, not the GitHub coding one) is hot garbage compared to Claude/ChatGPT, but that’s another story.

  • This is crazy. I use Copilot completions occasionally, but on average I think it's been productivity-neutral so far. Sometimes it helps, but this is roughly offset by fruitless rabbit holes and straight-up wrong information.

    One of my co-workers never uses any of it. For certain types of problems, he's the most productive member on the team.

    • > One of my co-workers never uses any of it. For certain types of problems, he's the most productive member on the team.

      Sounds like he needs a PIP!

    • Last time somebody asked me at work if I was using it, I had just turned off Visual Studio's normal autocompletion because the setting to disable completion on space, enter, dot, etc. had disappeared.

      MS productivity tools are a serious hindrance.

  • On the other hand, many, many (usually junior) engineers who have unfiltered access to coding assistants have basically become huge liabilities overnight with tools like Claude Code.

  • That's only true if you're making CRUD software and are easily replaceable by any random programmer. For anything more serious, LLMs are only useful as a better search engine.

  • > you are indeed becoming a liability to your company

    A risk that the stock price declines for not buying into the hype. Actual productivity is not a concern.

  • I haven't noticed coding assistants improving quality devs outside of perhaps occasionally saving on typing speed... and those opportunities are few and far between.

    • I've found it detrimental to quality because people at my workplace are now submitting code changes in languages they don't really understand (like multithreaded C++ code). The time they saved comes out of mine, since I have to explain all the problems with their code.

  • Then how was any useful software written before 2023?

    I notice that you said "using", though, and did not specify useful output. Useful output, or even just a GitHub repo, is kryptonite for "AI" proponents.

    If you mean that Microsoft has to pretend that its employees are using "AI" in order to keep its P/E ratio of roughly 40, then of course an employee who does not participate in the con becomes a liability, and you are absolutely right!

  • >> Honestly, at this point, if you’re a dev not using a coding assistant at all, in any possible way, you are indeed becoming a liability to your company.

    Honestly, you were probably a liability to your company prior to AI. Now you can at least vibe-code. </sarcasm> I don't know anything about your situation, nor do you know mine.

  • If you are submitting AI-slop code or overusing AI, then you are a liability.

    If you don't use AI at all you might be 5 or 10% slower, but the quality might well make up for it.

    • I would say even more important than quality is the fact that you will probably understand the code better. If an issue arises, you will already know the intricacies, while the AI-slop engineer almost has to ask the AI again. They have gotten themselves stuck in a sort of slop cycle where, after some time, human understanding is completely out of the picture.
