Comment by archagon
1 month ago
It's relatively easy to move to different cloud infrastructure (or host your own) down the line.
If you rely on an OpenAI LLM for your business, they can basically do whatever they want to you. Oh, prices went up 10x? What are you gonna do, train your own AI?
Anyone who says it’s relatively easy to move to a different cloud has never led a major migration (I have). That’s part of my day job: cloud consulting.
And if you think it’s hard to move to another LLM, you haven’t done a major implementation with LangChain (I have). It abstracts away most of the provider-specific work, so you can choose which LLM you want to use (rough sketch below).
You don’t train your own LLM. You use an existing LLM along with RAG.
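To illustrate the point, here's a minimal sketch (not anyone's actual setup; the package names langchain-openai / langchain-anthropic and the model strings are just examples): swapping the underlying model is essentially a one-line change, and the retrieved documents for RAG are passed in as part of the prompt.

    # Sketch: swapping the LLM behind a LangChain chain.
    # Model names are illustrative; check current provider docs.
    from langchain_core.prompts import ChatPromptTemplate
    from langchain_openai import ChatOpenAI
    from langchain_anthropic import ChatAnthropic

    prompt = ChatPromptTemplate.from_template(
        "Answer using only the context below.\n\n"
        "Context: {context}\n\nQuestion: {question}"
    )

    # Pick a provider; the rest of the chain stays the same.
    llm = ChatOpenAI(model="gpt-4o-mini")
    # llm = ChatAnthropic(model="claude-3-5-sonnet-latest")  # swap here

    chain = prompt | llm

    answer = chain.invoke({
        "context": "Retrieved documents go here (the RAG part).",
        "question": "What does our refund policy say?",
    })
    print(answer.content)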
I have no direct experience with this, but I’ve read that prices went down by 10x or so in 2024, and it seems that OpenAI has plenty of competition?
https://simonwillison.net/2024/Dec/31/llms-in-2024/#llm-pric...