Comment by A_D_E_P_T

20 hours ago

Just about everybody who isn't Nvidia dropped the ball, big time.

Intel should have shipped their GPUs with much more VRAM from day one. If they had done this, they'd have carved out a massive niche and much more market share, and it would have been trivially simple to do.

AMD should have improved their tools and software, etc.

Apple should have done as you say.

Google had nigh on a decade to boost TPU production, and they're still somehow behind the curve.

Such a lack of vision. And thus Nvidia is, now quite durably, the most valuable company in the world. Imagine telling that to a time traveler from 2018.

As for AMD, I think they were focused on competing against Intel. Remember, AMD was almost bankrupt about 15 years ago from competing against Intel. But the very first use of a GPU for AI was actually on an ATI/AMD GPU, not an Nvidia one.

Everyone thinks Nvidia kicked off the GPU AI craze when Ilya Sutskever and team cleaned up with AlexNet on Nvidia GPUs back in 2012, or when Andrew Ng and his team at Stanford published "Large-scale Deep Unsupervised Learning using Graphics Processors" in 2009. But in 2004, a couple of Korean researchers were the first to implement neural networks on a GPU, using ATI Radeons: https://www.sciencedirect.com/science/article/abs/pii/S00313...

And as of now I do believe AMD is in the second strongest position in the datacenter space after Nvidia, ahead of even Google.

Why should Apple have done this? It doesn't fit their business in any way, shape, or form. Where does data centre hardware sit relative to the electronics/humanities crossroads that is foundational for Apple?

  • > Why should Apple have done this?

    For money, probably.

    Apple is presumably leaving a lot of money on the table by not trying to sell Apple Silicon for AI inference and training. They're the only ones who can attach reasonably large GPUs (M3 Ultra) to very large amounts of comparatively cheap memory (512GB of unified LPDDR per GPU). Apple could, e.g., sell server SKUs of Mac Studios; heck, they could sell M3 Ultra chips on PCIe cards. And they could develop Apple Silicon further in that direction. Presumably they would be seen as a very legitimate competitor to Nvidia that way, perhaps more so than Intel and AMD. I'd assume that in the current climate this would be extremely lucrative.
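
    Back-of-envelope numbers, as a rough sketch (the ~20% overhead figure is my own assumption, for illustration only):

        # Can a dense model fit in a given amount of GPU-attached memory?
        # Assumes the weights dominate, plus ~20% for KV cache/activations.
        def fits(params_billion: float, bytes_per_param: float, mem_gb: float) -> bool:
            need_gb = params_billion * bytes_per_param * 1.2  # 1B params @ 1 byte ~ 1 GB
            return need_gb <= mem_gb

        print(fits(400, 1.0, 512))  # True:  ~480GB fits in one 512GB M3 Ultra
        print(fits(400, 1.0, 80))   # False: the same model would span ~6 80GB cards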

    Now, actually doing this would disrupt Apple's own supply chain and demand significant internal resources and cultural change for this kind of product line. There's a good argument to be made that it would disproportionately hurt its Mac business, so this would be a very risky move.

    But given that AI hardware is likely much higher margin than the Mac business, an argument could probably (sadly) be made that it'd be lucrative for them to try. I personally don't think Apple is inclined to take this kind of risk and jeopardize the Mac, but I'm sure some people at Apple have considered it.

    • I guess I mean that for Apple to remain Apple, they would not do this; it's a matter of company culture.

  • Yeah, nothing about Apple is server-side, and IMHO that's what training is. To be serious about it as a company you need all sorts of other tools (crawlers, etc.) helping with training, so at any reasonable scale it basically has to live in the datacenter anyway. And that's just not where Apple lives. We saw with Swift that they couldn't focus on the server side enough to make it a serious language there, and they've consistently declined to enter that area over the years because it's outside their wheelhouse.

Trust me: If Intel could, it would.

Word from inside: they were not breaking even on their existing GPUs. The strategy was to take a loss just to have a presence in the space.

  • Intel could position their cards as strong for certain workloads. They were first to market with hardware AV1 encoding, for example.

> And thus Nvidia is, now quite durably, the most valuable company in the world.

Nvidia is the most valuable company in the world right up until the AI bubble pops. Which, while it's hard to nail down when, is going to happen. I wouldn't call their position durable at all.

  • The crashing and burning of Nvidia stock has been predicted for a while now and keeps not happening. It's gone pretty flat and volatile up there around $180, but they keep delivering the results to back it up. I was thinking this week that Apple is really primed to make a killing from people who want to run their LLM on-device, coupled with an agent, in the next couple of years. We're a long way off being able to train the models locally; that is going to need an Nvidia-powered datacentre for the foreseeable future. But local inference seems absolutely like a market that Apple could capture, gutting the most premium revenue from Anthropic and OpenAI by selling Macs with a large amount of integrated memory to anyone who'd rather give them the money to run a native OpenClaw/agent than pay ever-growing monthly bills for tokens.
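
    For the local-inference piece, here's a minimal sketch of what that can look like on Apple Silicon today, assuming the open-source mlx-lm package (the model name is just an example of a quantized community build):

        # pip install mlx-lm  -- runs models out of Apple Silicon's unified memory
        from mlx_lm import load, generate

        model, tokenizer = load("mlx-community/Mistral-7B-Instruct-v0.3-4bit")
        reply = generate(model, tokenizer, prompt="Draft a short standup update.",
                         max_tokens=128)
        print(reply)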

  • It is definitely the case that they will fall a long way, but Nvidia will not fail as a whole. They have a way of relentlessly maximizing their position. CUDA has endlessly put them in amazing positions on things like image recognition, AR, crypto, and now AI.

    For all their faults in leaning in hard on these things for stock-market and personal gain, Nvidia still makes some of the best-quality products around. That is their saving grace.

    They will not be the world's most valuable company once the bubble pops, and will probably never get back there again, but they will continue to be a decent enough business. I just want them to go back to talking about graphics more than AI; that will be nice.

  • I might as well say that no, it is not going to happen.

    With hand-writing code rapidly going out of fashion this year, it seems likely AI is coming for most knowledge work next.

    And who is to say that manual labor is safe for long?