Comment by choldstare

11 hours ago

Not really - on-prem LLM hosting is extremely labor- and capital-intensive

But it can be, and is, done. I work for a bootstrapped startup that hosts a DeepSeek V3 retrain on our own GPUs, and we are highly profitable. We're certainly not the only ones in the space; I'm personally aware of several other startups hosting their own GLM or DeepSeek models.