Comment by fastball
6 hours ago
It is very ironic that this post comes from "The Privacy Guy", given that the whole point of this model is to run inference on your own device rather than send queries to the cloud. Running locally is also much less power intensive than sending a query to OpenAI.