Comment by aazo11
21 hours ago
This is a huge unlock for on-device inference. The download time of larger models makes local inference unusable for non-technical users.