Comment by PeterStuer
8 days ago
You are both right. B2B, for instance, is mostly fairly templated stuff built from CRUD and some business rules. Even some of the niches perceived as more 'creative', such as music scoring or 3D games, are fairly rote interactions with some 'engine'.
And I'm not even sure these 'template adjacent' regurgitations are what the crude LLM is best at, as the output needs to clear some rigorous, inflexible test to 'pass'. Hallucinating a non-existent function in an API will be a hard fail.
LLMs have a far easier time in domains where failures are 'soft'. This is why 'Eliza' passed as a therapist in the 60s, long before auto-programmers were a thing.
Also, in 'academic' research, LLM use has reached nearly 100%, not just for embellishing write-ups to the expected 20 pages, but at every stage of the 'game', including 'ideation'.
And if, as a CIO, you believe that your prohibition on using LLMs for coding because of 'divulging company secrets' holds, you are either strip-searching your employees on the way in and out, or wilfully blind.
I'm not saying nobody exists who avoids AI in anything created on a computer, just as some woodworkers still handcraft exclusive bespoke furniture in a time of presses, glue and CNC, but adoption is skyrocketing, and not just because the C-suite pressures their serfs into using the shiny new toy.
> "And if as a CIO you believe that your prohibition on using LLMs for coding because of 'divulging company secrets' holds, you are either strip searching your employees on the way in and out, or wilfully blind."
Right, so in certain areas you'll be legally required not to send your work to whatever third party promises to handle it the cheapest.
Also, since this is about actually "interesting" work: if you are doing cutting-edge research on, let's say, military or medical applications**, you definitely should take things like this seriously.
Obviously you can run LLMs locally if you don't feel like paying up for programmers who like to code and who want in-depth knowledge of whatever they are doing.
** https://www.bbc.co.uk/news/articles/c2eeg9gygyno
Of course you should not violate company policy, and some environments will indeed have more stringent controls and measures, but there is a whole world of grey where the CIO has put in place a moratorium on LLMs yet some people will quickly crunch out the day's work at home with an AI anyway so they look more productive.
You can of course consider running your own LLM.
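For what it's worth, a fully local setup can be just a few lines, e.g. with the Hugging Face transformers library. A minimal sketch is below; the model name and prompt are only examples, swap in whatever open-weights model fits your hardware and policy.

```python
# Minimal sketch: running an open-weights model on your own hardware,
# so no prompts or code ever leave the machine.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="Qwen/Qwen2.5-Coder-1.5B-Instruct",  # example model; any local model works
    device_map="auto",                          # uses GPU if present, else CPU
)

prompt = "Write a Python function that parses an ISO 8601 date string."
result = generator(prompt, max_new_tokens=200, do_sample=False)
print(result[0]["generated_text"])
```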
I suppose the problem isn't really the technology itself but rather the quality of the employees. There would have been plenty of people cheating the system before, let's say by copy-pasting or tricking their coworkers into doing the work for them.
However, if you are working on something actually interesting, chances are you're not working with disingenuous grifters or uneducated, lazy backstabbers, so that's less of a concern as well. If you are working on interesting projects, hopefully such people would have been filtered out somewhere along the line.