Comment by applfanboysbgon
8 hours ago
> Except it comes with a terrible experience that's not sustainable for any serious day-to-day work that doesn't involve constant coffee breaks to wait for some tokens to get generated.
I think you may have misinterpreted what I was saying as a reference to local models? I am not talking about local. You cannot run DeepSeek on consumer hardware, despite a bunch of people conflating "some 30B model trained on DeepSeek outputs" with DeepSeek itself. But businesses can purchase fleets of GPUs capable of serving DeepSeek for an investment measured in millions rather than billions, and offer something 85% as good as Claude to customers while actually profiting on inference with a $20 subscription, without the massive overhead of training frontier models from scratch.
> (even if someone else is subsidizing it... for now)
That they are giving away something they cannot sustain is the literal entire point of my comment.