Comment by runjake
2 days ago
Ollama + M3 Max 36GB Mac. Usually with Python + SQLite3.
The models vary depending on the task. DeepSeek distilled has been a favorite for the past several months.
I use various smaller (~3B) models for simpler tasks.
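The commenter doesn't share code, so here is a minimal sketch of what an "Ollama + Python + SQLite3" loop could look like: send a prompt to Ollama's default local HTTP endpoint and log the result in a SQLite table. The model name, table schema, and function names are assumptions, not the commenter's actual setup.

```python
import json
import sqlite3
import urllib.request

# Ollama's default local API endpoint.
OLLAMA_URL = "http://localhost:11434/api/generate"

def init_db(path=":memory:"):
    # Hypothetical schema: one row per prompt/response pair.
    con = sqlite3.connect(path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS completions ("
        "id INTEGER PRIMARY KEY, model TEXT, prompt TEXT, response TEXT)"
    )
    return con

def ask_ollama(prompt, model="deepseek-r1:8b"):
    # Single non-streaming request; model name is an assumption.
    payload = json.dumps(
        {"model": model, "prompt": prompt, "stream": False}
    ).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]

def log_completion(con, model, prompt, response):
    con.execute(
        "INSERT INTO completions (model, prompt, response) VALUES (?, ?, ?)",
        (model, prompt, response),
    )
    con.commit()
```

Swapping in a smaller ~3B model for simpler tasks is just a matter of changing the `model` argument.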