
Comment by TrackerFF

10 hours ago

When GPT-3.5 was released, it could handle maybe a 500 LOC codebase. Experienced engineers called it cute, but zero threat to actual programmers.

Then it became thousands.

Now models can handle and operate on codebases with hundreds of thousands of LOC, even into the low millions.

So in just 3.5 years we've gone from LLMs being cute toys, to being powerful enough to actually replace junior engineers. Even if we hit a new AI winter tomorrow, the proverbial damage is already done.

What damage lmao? Let's see the LLM producers raise the price to what is necessary to generate viable returns.

BTW they need to make enough to finance reinvestment internally… so it's a lot more than you think. When they raise the price, firms will have to do a deep-dive analysis of what to do, because they can't let operating expenses climb incrementally without seeing revenue and cost of operations move in a favourable direction.

It’s easy when prices are lower than they should be.

Your prediction is missing all this detail. So….