Comment by zozbot234
6 months ago
> it's too slow to be useful with such specs.
Only if you insist on real-time output: if you're OK with posting your question to the model and letting it run overnight (or, for shorter questions, over your lunch break), it's great. I believe this use case fits local AI especially well.