
Comment by mzl

3 months ago

I think you are underestimating the massive amount of Python code that is built around these things. Also, a lot of businesses are not really interested in using an API for an LLM; instead, they will modify and fine-tune their own models and deploy them in their own data centers (virtual or physical), and that means even more Python code.

Sure, a system that only relies on token-factory LLM APIs can be written in any language, but that is not the full breadth of the AI hype.

> Also, a lot of businesses are not really interested in using an API for an LLM, instead they will modify and fine-tune their own models and deploy in their own data-centers

You realize model training costs millions, right? "A lot of businesses" doesn't pass the sniff test here.

I'm not even counting the large swaths of data required to train. And the expensive specialists.

And then you'll have to retrain outdated models every so often.

There's a reason that AI has only a handful of players delivering SoTA models, and these players are all worth $5B+.

  • SoTA LLM training costs a lot, yes. But fine-tuning and training of smaller models are a lot cheaper.

    I've trained useful vision models that delivered business value for industrial applications on a MacBook overnight. Roughly, that kind of overnight run looks like the sketch below.
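
    A minimal fine-tuning sketch in PyTorch/torchvision, not my exact setup: it assumes an ImageNet-pretrained ResNet-18 and an ImageFolder-style dataset under a hypothetical data/train directory, and every path, model choice, and hyperparameter here is illustrative.

    ```python
    import torch
    from torch import nn
    from torch.utils.data import DataLoader
    from torchvision import datasets, models, transforms

    # Apple-silicon MacBooks can use the MPS backend; fall back to CPU otherwise.
    device = "mps" if torch.backends.mps.is_available() else "cpu"

    # Hypothetical dataset layout: data/train/<class_name>/*.jpg
    transform = transforms.Compose([
        transforms.Resize((224, 224)),
        transforms.ToTensor(),
    ])
    train_ds = datasets.ImageFolder("data/train", transform=transform)
    train_dl = DataLoader(train_ds, batch_size=32, shuffle=True)

    # Start from an ImageNet-pretrained backbone and swap in a new classifier head.
    model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    model.fc = nn.Linear(model.fc.in_features, len(train_ds.classes))
    model = model.to(device)

    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
    loss_fn = nn.CrossEntropyLoss()

    model.train()
    for epoch in range(10):  # epoch count is illustrative; "overnight" depends on data size
        for images, labels in train_dl:
            images, labels = images.to(device), labels.to(device)
            optimizer.zero_grad()
            loss = loss_fn(model(images), labels)
            loss.backward()
            optimizer.step()
    ```

    The point is the scale: a pretrained backbone, a few thousand labeled images, and a laptop GPU are enough for many industrial vision tasks, with no million-dollar training run involved.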