Comment by Wowfunhappy, 6 months ago

> This echoes a lot of the rhetoric around "but how will facebook/twitter/etc make money?" back in the mid 2000s.

The difference is that Facebook costs virtually nothing to run, at least on a per-user basis. (Sure, if you have a billion users, all of those individual rounding errors still add up somewhat.)

By contrast, if you're spending lots of money per user... well, look at what happened to MoviePass!

The counterexample here might be YouTube; when it launched, streaming video was really expensive! It's still expensive today, but clearly Google has figured out the economics.

You're either overestimating the cost of inference or underestimating the cost of running a service like Facebook at that scale. Meta's cost of revenue (i.e. just running the service, not R&D, not marketing, not admin, none of that) was about $30B/year in 2024. In the leaked OpenAI financials from last year, their 2024 inference costs were 1/10th of that.

  • But their research costs are extremely high, and without a network effect that revenue is only safe until a better competitor emerges.

    • You're moving the goalposts, given the original complaint was not about research costs but about the marginal cost of serving additional users...

      I guess you'd be surprised to find out that Meta's R&D costs are an order of magnitude higher than OpenAI's training + research costs? ($45B in 2024, vs. about $5B for OpenAI according to the leaked financials.)

  • You're right, I was underestimating the cost of running Facebook! $30B spent / ~3B users = ~$10 per user per year. I'd thought it would be closer to 10¢.

    Do you know why it's so expensive? I'd thought serving html would be cheaper, particularly at Facebook's scale. Does the $30B include the cost of human content moderators? I also guess Facebook does a lot of video now; do you think that's it?

    Also, even still, $10 per user has got to be an order of magnitude less than what OpenAI is spending on its free users, no?

    • > Do you know why it's so expensive? I'd thought serving html would be cheaper, particularly at Facebook's scale.

      I don't know about Facebook specifically, but in general people underestimate the amount of stuff that needs to happen for a consumer-facing app of that scale. It's not just "serving html".

      There are going to be thousands of teams running thousands of services or workflows, each doing something incredibly obscure but necessary for some regulatory, commercial, or operational reason. (Yes, moderation would be one of those functions.)

      > Also, even still, $10 per user has got to be an order of magnitude less than what OpenAI is spending on its free users, no?

      No. OpenAI's inference costs in 2024 were a few billion (IIRC there are two conflicting reports about the leaked financials, one putting inference costs at $2B/year, the other at $4B/year). That covers their paid subscription users, API users, and free consumer users combined. And at the time they were reported to have 500M monthly active users.

      Even if we make the most extreme possible assumptions for all the degrees of freedom (all costs can be assigned to the free users rather than the paid ones, the higher number for total inference spend, monthly users == annual users), the cost per free user would still be at most $8/year.
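
      If you want to sanity-check that bound, here's a rough sketch of the arithmetic. All inputs are just the numbers cited in this thread (leaked/reported figures, not audited ones):

        # Back-of-the-envelope upper bound on OpenAI's cost per free user,
        # using only the numbers cited above (assumed, not verified).
        inference_cost_high = 4e9      # $4B/year, the higher of the two reported figures
        monthly_active_users = 500e6   # reported ~500M MAU

        # Worst case: assign the whole inference bill to free users, use the
        # higher cost figure, and treat MAU as the annual user count.
        per_free_user = inference_cost_high / monthly_active_users
        print(f"Worst-case cost per free user: ${per_free_user:.2f}/year")   # $8.00/year

        # For comparison, Meta's 2024 cost of revenue spread over ~3B users:
        print(f"Meta cost per user: ${30e9 / 3e9:.2f}/year")                 # $10.00/year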