Comment by bearjaws
1 year ago
This is why we need to focus on smaller, more efficient models.
A lot of the 13B and 30B parameter models are actually quite good. Maybe not GPT-4, but they run on my M1 MBP without an issue, and a bit faster than GPT-4 anyway.
I don't need a chat interface that's amazing at all things at once; I need it to be great at one subject at a time.