Comment by robotpepi
5 hours ago
They explain it in the article: this is the first iteration, so they wanted to start with something simple, i.e., this is a tech demo.
Ok then I look forward to seeing DeepSeek running instantly at the end of April.
Why so negative lol. The speed and much lower power use of this thing are nothing to sneeze at. I mean, hardware-accelerated LLMs are a huge step forward. But yeah, this is basically a proof of concept. I wouldn't be surprised if the size and power use come down even further, and we start seeing stuff like this in all kinds of hardware. It's an enabler.