Comment by hn_throwaway_99
5 hours ago
Maybe for right now, but even in the very near future it seems like data center expertise would absolutely be a core competency of any AI leader.
Heck, look at Facebook. Granted, they got started slightly before AWS, but not by much. Owning all of their own data centers is a huge competitive advantage for them, and unlike most of the other hyperscalers they don't sell compute to other companies (AFAIK).
Again, the commitment is for $100 billion in spend. Building lots of data centers for far less than that should absolutely be doable. Also, geographic distribution isn't nearly as important for AI companies given the way LLMs work. The primary benefit of being close to your data center is reduced latency, but if you think about your average chatbot interface, inference time absolutely swamps network latency, so it's not as big a deal. Sure, you'd probably need data centers in different locales for legal reasons, and for general diversification, but, one more time, $100 billion should buy a lot of data centers.
It's interesting that you mention Facebook. They have a ton of their own data centers and yet they are now also spending tens of billions on cloud. It's not that easy to build hundreds of data centers on short notice.