Comment by fancyfredbot
3 days ago
Google may well want to run more of their content through an LLM, but they won't be using Nvidia hardware to do it; they'll be using their own TPUs.
Amazon is on its third generation of in-house AI chips, and Anthropic will be using those chips to train the next generation of Claude.
In other words, Nvidia's biggest customers are looking for cheaper alternatives and are already succeeding in finding them.
Google and Amazon still have to buy tons of Nvidia hardware to offer in their clouds. Hardly anyone outside their internal teams writes software for their custom chips, because the software stack just doesn't exist.