Comment by bigyabai
12 hours ago
Isn't a larger concern that Tim "Services" Cook failed to skate where the puck was headed on this one? 15 years ago the Mac had Nvidia drivers, OpenCL support and a considerable stake in professional HPC. Today's Macs have none of that.
Every business has to make tradeoffs; it's just hard to imagine that any of these decisions were truly worthwhile with the benefit of hindsight. After the botched launch of the Vision Pro, Apple has to prove its worth to the wider consumer market again.
Apple Silicon Macs are great for running LLMs. The unified memory and memory bandwidth of the Max and Ultra processors are very useful for doing inference locally.
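For what it's worth, here's a minimal sketch of what local inference looks like in practice, assuming the mlx-lm Python package and a quantized checkpoint from the mlx-community Hugging Face org (both are my own example choices, not anything specific to the comment above):

    # Local LLM inference on Apple Silicon via mlx-lm
    # (assumes `pip install mlx-lm`; the model name below is just an
    # illustrative 4-bit quantized checkpoint, not a recommendation).
    # The weights load into unified memory, which is why a Max/Ultra
    # with enough RAM can serve large models without a discrete GPU.
    from mlx_lm import load, generate

    # Download (if needed) and load the model plus its tokenizer.
    model, tokenizer = load("mlx-community/Mistral-7B-Instruct-v0.3-4bit")

    # Run a single generation; verbose=True streams tokens and prints
    # throughput stats so you can see the memory-bandwidth effect.
    response = generate(
        model,
        tokenizer,
        prompt="Explain why unified memory helps local LLM inference.",
        verbose=True,
    )
    print(response)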
Great news, but entirely lost on commercial hyperscalers and much of the PC market. Apple's recalcitrance towards supporting Nvidia drivers basically killed their last shot at real-world rackmount deployment of Apple Silicon. Now you can go buy an ARM Grace CPU that does the same thing, but cheaper and with better software support.
You really can’t. NVIDIA’s ARM chip still looks nerfed compared to Apple’s offering, and I can run 40GB LLMs on a plane with no internet; that’s not something you can do on any other platform.
> Isn't a larger concern that Tim "Services" Cook failed to skate where the puck was headed on this one?
Doesn't somebody (not named Nvidia) need to make a serious profit on AI before we can say that Tim Cook failed?
OpenAI and Anthropic aren't anywhere close. Meta? Google? The only one I can think of might be Microsoft, but they still refuse to break out AI revenue and expenses in their earnings reports. That isn't a good sign.
I certainly don't think that profit would be required. Many of the massive tech companies that exist today went through long periods where they focused on growth and brand, not profits, even post-IPO.
I won't pretend to know exactly how the AI landscape will look in the future, but at this point it's pretty clear that there's going to be massive revenue going to the sector, and Moore's law will continue to crank.
I see what you're saying, though. In particular, these first-generation gigawatt-scale data centers might be black holes of an investment, considering that in the not-too-distant future AI compute will be fully commoditized and 10x cheaper.
Apple's X/OpenGL support has also been in stasis for 10 years or more. There's not enough money in taking over for SGI to move their needle.
Macs are basically a dead business. The key is somehow creating the AI equivalent of an App Store, or something.
Don't abandon Intel Macs, then; call them Mac AI systems with NVIDIA chips and sell them for more than the Apple Silicon Macs.