Comment by qnleigh
19 hours ago
> when they finally execute on the blindingly obvious strategic play that they are naturally positioned for
What's that? It's not obvious to me, anyway.
Inference hardware, especially starting with on-device AI for the Mac. I think they should go as far as making a server chip, but that's less obvious today.
My guess would be local AI. Apple Silicon is uniquely suited here thanks to its unified memory.
Yeah exactly. The MacBook Pro is by far the most capable consumer device for running local LLMs.
A beefed-up NPU could provide a big edge here.
More speculatively, Apple is also one of the few companies positioned to market an ASIC for a specific transformer architecture, which they could use for their Siri replacement.
(Google has on-device inference too, but their business model depends on them not being privacy-focused, and their GTM with Android precludes the tight coordination between OS and hardware that would be required to push SOTA models into hardware.)
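To make the unified-memory point concrete, here's a minimal sketch of local inference on Apple Silicon using Apple's open-source mlx-lm package; the specific model name is just an illustrative pick from the mlx-community hub, not something anyone in the thread endorsed:

```python
# Minimal sketch of local LLM inference on Apple Silicon
# via the open-source mlx-lm package (pip install mlx-lm).
from mlx_lm import load, generate

# Weights load into unified memory shared by the CPU and GPU,
# so a 4-bit-quantized 7B model fits on a base MacBook Pro.
# Model name is illustrative; any mlx-community model works.
model, tokenizer = load("mlx-community/Mistral-7B-Instruct-v0.3-4bit")

response = generate(
    model,
    tokenizer,
    prompt="Explain why unified memory helps local inference.",
    max_tokens=200,
)
print(response)
```

No copies between CPU and GPU memory pools are needed, which is the practical edge unified memory gives over discrete-GPU consumer machines.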
I see. It'll be interesting to see how much on-device models take off for consumers when off-device models will be so much more capable. In the past, the average consumer has typically been happy to trade privacy for better products, but maybe it will be different for LLMs.
They are well positioned, but they have a history of screwing up their AI plays. I hope they can get it right.
Embrace the vibe, man