Comment by thelastgallon
1 day ago
Also, Google owns the entire vertical stack, which lets it provide the full spectrum of AI services far more cheaply, at scale (and still profitably), via its cloud. Not every company needs to buy hardware and build its own models; what most companies need is an app store of AI offerings they can leverage. Google can offer this at a healthy profit margin, while others will eventually run out of money.
Google's work on JAX, TensorFlow, PyTorch/XLA, and the more general XLA compiler underneath is exactly the kind of anti-moat everyone has been clamoring for.
Anti-moat, as in commoditizing the complement?
If they get things like PyTorch to work well without caring what hardware it is running on, that erodes Nvidia's CUDA moat. Nvidia's chips are excellent, without a doubt, but its real moat is the ecosystem around CUDA.
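Concretely, here is a rough sketch of what that hardware indifference looks like in user code (the model and shapes are made up for illustration): the same PyTorch model runs on CUDA, Apple's MPS backend, or plain CPU just by swapping the device, and the more the framework hides that choice, the less the CUDA ecosystem locks you in.

    # Toy example: identical model code, backend picked at runtime.
    import torch
    import torch.nn as nn

    def pick_device() -> torch.device:
        if torch.cuda.is_available():           # Nvidia GPUs via CUDA
            return torch.device("cuda")
        if torch.backends.mps.is_available():   # Apple silicon via Metal
            return torch.device("mps")
        return torch.device("cpu")              # fallback on any machine

    device = pick_device()
    model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10)).to(device)
    x = torch.randn(32, 128, device=device)
    print(model(x).shape, "on", device)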
Yes!
PyTorch, JAX, and TensorFlow are all, to me, examples of very capable products that compete very well in the ML space.
But more broadly, work like XLA and IREE gives us very interesting toolkits for mapping a huge variety of computation onto many types of hardware. While PyTorch et al. are fine example applications, concrete things you can do, XLA is the Big Tent idea: the toolkit to erode not just specific CUDA use cases, but to make hardware in general more broadly useful.
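To make the XLA point concrete, here is a toy JAX sketch (the function and shapes are invented for illustration): jax.jit lowers ordinary Python through XLA, and the same source runs on whichever backend JAX finds, whether CPU, GPU, or TPU, with no CUDA-specific code anywhere.

    # Toy example: one function, compiled by XLA for whatever backend is present.
    import jax
    import jax.numpy as jnp

    @jax.jit
    def mlp(params, x):
        w1, b1, w2, b2 = params
        h = jax.nn.relu(x @ w1 + b1)
        return h @ w2 + b2

    k1, k2, kx = jax.random.split(jax.random.PRNGKey(0), 3)
    params = (jax.random.normal(k1, (128, 64)), jnp.zeros(64),
              jax.random.normal(k2, (64, 10)), jnp.zeros(10))
    x = jax.random.normal(kx, (32, 128))
    print(mlp(params, x).shape, "compiled for", jax.devices()[0].platform)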
They just need to actually make and market a good product, though, and they seem to really struggle with this. Maybe on a long enough timeline their advantages will make it inevitable.
With all this vertical integration, it's no wonder Apple and Google have such a tight relationship.