Comment by mchusma
10 hours ago
Ever since I saw the first of these one-bit models from Microsoft, I've thought this was a fascinating route. I assume that in practice it's less helpful than it seems, just because the big AI labs have every economic incentive in the world to produce small, powerful, fast models. None of them seem to be using this technique, so it's interesting, but I suspect it doesn't quite work.
I also have yet to see any of these at a larger scale. For example, can you try one of these at 100 billion parameters?
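For anyone unfamiliar with the technique being discussed: the "one-bit" models (Microsoft's BitNet b1.58) actually constrain weights to three values, {-1, 0, +1}, plus a per-tensor scale, which is what makes multiplication-free inference possible. Here's a rough sketch of the idea using absmean-style scaling; this is an illustration of the concept, not Microsoft's actual implementation.

```python
def quantize_ternary(weights):
    # Scale by the mean absolute value ("absmean"), then round each
    # weight to the nearest of {-1, 0, +1}. A sketch of the idea behind
    # BitNet b1.58-style ternary quantization, not the real training code
    # (in BitNet this happens during training, with full-precision
    # weights kept for the gradient updates).
    scale = sum(abs(w) for w in weights) / len(weights) or 1e-8
    quantized = [max(-1, min(1, round(w / scale))) for w in weights]
    return quantized, scale

weights = [0.8, -1.3, 0.05, 2.1, -0.4]
q, s = quantize_ternary(weights)
# Each original weight w_i is approximated by q_i * s, so a matmul
# reduces to additions/subtractions plus one final rescale.
```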