Comment by jeffbee

2 days ago

It had to have been launched longer ago than that because their first public-facing, TPU-using generative product was Inbox Smart Reply, which launched more than 10 years ago. Add to that however much time had to pass up to the point where they had the hardware in production. I think the genesis of the project must have been 12-15 years ago.

The Acquired podcast recently did a nice episode on the history of AI at Google, going all the way back to "I'm Feeling Lucky" and the early versions of Translate. All of that laid the groundwork for adding AI features to Google and running them at Google scale, starting early in Google's history when everything still ran on CPUs.

The transition to using GPU accelerated algorithms at scale started happening pretty early in Google around 2009/2010 when they started doing stuff with voice and images.

This started with Google buying a few big GPUs for R&D and then suddenly showing up as a major customer for Nvidia, which until then had no idea it was going to become an AI company. Internal work on TPUs started around 2013; the first versions were deployed around 2015, and Google has been iterating on them ever since. Interestingly, OpenAI was founded around the same time.

OpenAI has a moat as well: brand recognition, diversified hardware supplier deals, and funding. Nvidia is no longer the only game in town; Intel and AMD are contenders too. Google's TPUs give them a short-term advantage, but hardware capabilities are becoming a commodity in the long term. OpenAI and Google need to demonstrate value to end users, not just cost optimizations, because that's what will decide where the many billions in AI subscription spending go. Google might be catching up, but OpenAI is the clear leader in paid subscriptions.

Google has spent the last fifteen years chasing products, always trying to catch up with the latest and greatest in messaging, social networking, and now AI. They've shipped a lot of copycat products and not many original ones. It's not a safe bet that this time will go differently for them.

  • but cost is critical. Customers have shown they're willing to pay roughly $20/month, regardless of how much the underlying cost is to the provider.

    Google is almost an order of magnitude cheaper to serve GenAI than ChatGPT. Long term, this will be a big competitive advantage for them. Look at their very generous free tier compared to others. And the products are not subpar; they do compete on quality. OpenAI had the first-mover advantage, but it's clear the crowd willing to pay for these services is not very sticky: churn spikes whenever a new model is released, making this one of the more competitive markets.

    • I don't even know if it amounts to $20. If you already pay for Google One, the marginal cost isn't that much. And if you're all in on Google stuff like Fi, Pixel phones, or YouTube Premium, you get a big discount on the recurring costs.