Comment by btown
12 days ago
It would be incredibly ironic if, with Apple's supply chain being relatively stable compared to the chaos of the RAM market these days (chaos that's projected to last for years), Apple compute became known as a cost-effective way to build medium-sized clusters for inference.
It’s gonna suck if all the good Macs get gobbled up by commercial users.
Outside of YouTube influencers, I doubt many home users are buying a 512GB RAM Mac Studio.
I doubt many of them are, either.
When the 2019 Mac Pro came out, it was "amazing" how many still-photography YouTubers all got launch-day deliveries of the same BTO Mac Pro, with exactly the same spec:
18-core CPU, 384GB memory, Vega II Duo GPU, and an 8TB SSD.
Or, more likely, Apple worked with them and made sure each of them had this Mac on launch day, while they waited for the model they actually ordered. Because they sure as hell didn't need an $18,000 computer for Lightroom.
2 replies →
I'm neither, and I have two. 24/7 async inference against GitHub issues. Free (once you buy the Macs, that is).
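(The commenter doesn't give details, but as a rough illustration of what that kind of setup might look like, here is a minimal Python sketch: poll a repo's open issues and run each one through a local OpenAI-compatible endpoint, such as Ollama or a llama.cpp server running on the Mac. The repo name, endpoint URL, model name, prompt, and polling interval are all assumptions, not the commenter's actual setup.)

    # Hypothetical sketch: poll a repo's open issues and triage each new one
    # with a local OpenAI-compatible endpoint (e.g. Ollama / llama.cpp on a Mac).
    # Repo, endpoint, model, and prompt are placeholders.
    import time
    import requests

    REPO = "someorg/somerepo"                    # placeholder repo
    GITHUB_API = f"https://api.github.com/repos/{REPO}/issues?state=open"
    LLM_ENDPOINT = "http://localhost:11434/v1/chat/completions"  # assumed local server
    MODEL = "llama3"                             # whatever model the Mac is serving

    def triage(issue: dict) -> str:
        """Ask the local model to summarize and suggest a next step for one issue."""
        prompt = (
            "Summarize and suggest a next step for this issue:\n\n"
            f"{issue['title']}\n\n{issue.get('body') or ''}"
        )
        resp = requests.post(
            LLM_ENDPOINT,
            json={"model": MODEL, "messages": [{"role": "user", "content": prompt}]},
            timeout=300,
        )
        resp.raise_for_status()
        return resp.json()["choices"][0]["message"]["content"]

    seen = set()
    while True:  # the "24/7" loop: poll, process anything new, sleep
        for issue in requests.get(GITHUB_API, timeout=30).json():
            # the issues endpoint also returns PRs; skip those and anything already handled
            if issue["number"] not in seen and "pull_request" not in issue:
                print(f"#{issue['number']}: {triage(issue)[:200]}")
                seen.add(issue["number"])
        time.sleep(600)  # arbitrary polling interval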
18 replies →
That product can still steal fab slots from cheaper, more prosumer products.
I did. Admittedly it was for 8K video processing, which uses more than 128GB of RAM, but I am NOT a YouTuber.
Of course they're not. Everybody is waiting for the next generation that will run LLMs faster before they start buying.
1 reply →
It's not like regular people can afford this kind of Apple machine anyway.
It’s just depressing that the “PC in every home” era is being rapidly pulled out from under our feet by all these supply shocks.
35 replies →
It already is, depending on your needs.