Comment by g947o
2 months ago
I can imagine Apple shipping Mac Pros with add-ons that allow running local inference with minimal setup. "Look, just spend $50k on this machine and you get a usable LLM server that can be shared by a team." But they don't seem particularly interested in that market.