Comment by hadlock
3 days ago
Seems like the LLM landscape is still evolving, and training your own model provides no real technical benefit when you can simply buy or lease one, without the overhead of additional eng staffing or a datacenter build-out.
I can see a future where LLM research stalls and stagnates, at which point the ROI on building and maintaining their own commodity LLM might become tolerable. Apple has had Siri as a product/feature for the better part of a decade, and in that time they've proven that voice assistants are not something they're willing to build a proficiency in. My wife has had an Apple iPhone for at least a decade now, and I've heard her use Siri perhaps twice in that time.
And if you wanted to build your own data center right now, there's only so much GPU and RAM to go around, and even the power generation and cooling manufacturers are booked solid.