Comment by xrd
18 hours ago
The takeaway from these comments is that you really can run local models if you use M-series devices from Apple.
But can you still do that if you install Linux on that hardware?
I hate to admit it, but Apple hardware is incredible. I can't say the same about macOS anymore.
Can I run Linux and still reap the benefits of M-series chips for local inference?
Or are there alternatives for running LLMs on a Linux laptop?