
Comment by JCM9

20 days ago

People cared about the OpenAI drama when it looked like they might have some real edge and the future of AI depended on them. Now it's clear the tech is cool but rapidly converging toward a commodity, with nobody having an edge that translates into a sustainable business model.

In that reality they can drama all they want now; nobody really cares anymore.

Yes, and the open source models + local inference are progressing rapidly. This whole API idea is kind of limited by the fact that you need to round-trip to a datacenter + trust someone with all your data.

Imagine when OpenAI has their 23andMe moment in 2050 and a judge rules all your queries since 2023 are for sale to the highest bidder.

  • It doesn't need to wait until 2050. The queries would be for sale as soon as they stop providing a competitive advantage.

  • Even worse for these LLM-as-a-service companies is that the utility of open source LLMs largely comes down to the customization: you can get a lot of utility by restricting token output, varying temperature, and lightly retraining them for specific applications.

    The use-cases for LLMs seem unexplored beyond basic chatbot stuff.

    • I'm surprised at how little discussion there is of their utility for turning unstructured data into structured data, even with some margin of error. It doesn't even take an especially large model to accomplish it, either.

      I would think entire industries could re-form around having an LLM as a first pass over data, with software and/or human error checking, at a significant cost reduction over previous strategies (a rough sketch of what that first pass might look like is below).

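      A minimal sketch of what that kind of first pass could look like. It assumes the official openai Python SDK (an OpenAI-compatible chat-completions endpoint); the model name, the three-field schema, and the limits are placeholders, with the temperature/max-token settings standing in for the customization knobs mentioned above and a cheap validation step standing in for the software error check.

      ```python
      # Minimal sketch: LLM as a first pass for unstructured -> structured data,
      # with a cheap software check behind it. Model name, schema fields, and
      # limits are placeholders, not recommendations.
      import json
      from openai import OpenAI

      client = OpenAI()  # reads OPENAI_API_KEY from the environment

      REQUIRED_FIELDS = {"name", "date", "amount"}  # hypothetical schema

      def extract(record_text: str) -> dict | None:
          """Ask the model for JSON, then validate it before trusting it."""
          resp = client.chat.completions.create(
              model="gpt-4o-mini",   # placeholder; a small model is often enough
              temperature=0,         # the kind of knob the comment above mentions
              max_tokens=200,        # restrict token output
              response_format={"type": "json_object"},
              messages=[
                  {"role": "system",
                   "content": "Extract the fields name, date, amount from the text. "
                              "Reply with a single JSON object and nothing else."},
                  {"role": "user", "content": record_text},
              ],
          )
          try:
              data = json.loads(resp.choices[0].message.content)
          except json.JSONDecodeError:
              return None  # malformed output -> route to human review
          if not REQUIRED_FIELDS.issubset(data):
              return None  # missing fields -> human review queue
          return data
      ```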

There's more to business than tech. There's more to business than product.

The software behind Facebook as an app wasn't particularly unique, yet it eclipsed the competition. The same could be said for Google. Google didn't even have any real lock-in for years, but it still owned consumer mindshare, which gave it the vast majority of search traffic, which made it one of the most valuable companies in the world.

ChatGPT is in a similar position. The fact of the matter is, the average person knows what ChatGPT is and how to use it. Many hundreds of millions of normal people use ChatGPT weekly, and the number is growing. The same cannot be said of Claude, DeepSeek, Grok, or the various open source models.

And the gap is massive. It's not even close. It's like 400M weekly ChatGPT actives vs 30M monthly Claude actives.

So yes, the average Hacker News contrarian who thinks their tiny bubble represents the entire world might think that "nobody cares," in part because nobody they know cares, and in part because that assessment aligns with their own personal biases and desires.

But anyone who's been paying attention to how internet behemoths grow for the past 30 years certainly still cares about OpenAI.

  • > The software behind Facebook as an app wasn't particularly unique, yet it eclipsed the competition. The same could be said for Google.

    I remember the search engines of the time and Google was a quantum leap.

    ChatGPT is even more revolutionary, but whatever Google is now, it was once brilliant.

    • It gets more interesting the older I get, watching people speculate about the internet now vs. the early internet without having lived through the 90s internet. My first response is "do you not remember what that was like?!" Then I remember: no, in fact, they might not even have been born yet.

    • I agree, just saying, ChatGPT was a quantum leap, too. That's why it has all the consumer mindshare.

  • You can't compare Facebook with ChatGPT because the costs per user differ by orders of magnitude. One $5/mo VPS can serve the traffic of several hundred thousand Facebook users, while ChatGPT needs an array of GPUs per active user. They can optimize this somewhat, but never as much as Facebook can.

    This means that they're stuck with more expensive monetization plans to cover their free tier loss leader, hence the $200/mo Pro subscription. And once you're charging that kind of price to try to make ends meet, you're ripe for disruption no matter how good your name recognition.

    • "ChatGPT needs an array of GPUs per active user" - nit: you're exaggerating by a few orders of magnitude.

      First, queries from users can be combined and fed into servers in batches so that hundreds of queries can be concurrently served by a single node. Second, people aren't on and asking ChatGPT questions every second of every day. I'd guess the median is more like ~single digit queries per day. Assuming average response length of 100 tokens and throughput of 50 tok/s at batch size 50, that's 25 QPS or 2.1M queries per day, or 420k users served per node at 5 queries per user per day.

      Now, a single 8xH100 node is a lot more expensive than $5/mo, so you're directionally correct there, but I'd wager you can segment your market aggressively and serve heavily distilled/quantized models (small enough to fit onto single commodity GPUs, or even CPUs) to your free tier. Finally, this is subject to Huang's Law, which says every 2 years the cost of the same performance will more than halve.
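      For anyone who wants to re-run that estimate, here is the same back-of-envelope arithmetic as a few lines of Python. Every input is an assumption taken from the comment above, not a measured figure, and the small differences from the 2.1M/420k numbers are just rounding.

      ```python
      # Back-of-envelope capacity estimate using the assumptions stated above.
      tokens_per_response = 100        # assumed average response length
      tokens_per_sec_per_stream = 50   # assumed decode speed per request
      batch_size = 50                  # concurrent requests per node
      queries_per_user_per_day = 5     # assumed typical usage

      seconds_per_response = tokens_per_response / tokens_per_sec_per_stream  # 2 s
      qps_per_node = batch_size / seconds_per_response                        # 25 QPS
      queries_per_day = qps_per_node * 86_400                                 # ~2.16M
      users_per_node = queries_per_day / queries_per_user_per_day             # ~432k

      print(f"{qps_per_node:.0f} QPS -> {queries_per_day / 1e6:.1f}M queries/day "
            f"-> ~{users_per_node / 1e3:.0f}k users per node")
      ```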

    • People said similar things about Facebook. "Oh their user growth might be amazing, but they're not making any money, it's not a real business."

      But it turns out that with enough funding, you can prioritize growth over profit for a very long time. And with enough growth, you can raise unlimited funds before you get to that point. And going this route is smart and effective if you want to get to a $1T valuation in under a decade.

      So yeah, ChatGPT's margins might not be as high as Facebook's. But it doesn't really matter at this point, they're in growth mode. What matters is whether or not they'll be able to turn their lead and their mindshare into massive profits eventually, and while we can speculate on that, it's far too early to definitively say the answer is no.


    • Rather than getting into the nitty-gritty details of monetization: when we ask whether OpenAI can nail product the way Facebook did (I guess) and become the next tech giant, I think we have to ask whether that's even possible now that the tech industry is as established as it is.

      You would think the existing megacaps would be all over any new market if there were a profit to be made. Facebook's competition was basically other startups. That said, Google seems to be dropping the ball almost as badly as Yahoo did.

      But sure, if there's absolutely no way to make money from consumer AI, then that will also make it hard for OpenAI to win the game.

  • > Google didn't even have any real lock-in for years, but it still owned consumer mindshare, which gave it the vast majority of search traffic, which made it one of the most valuable companies in the world.

    This isn't correct at all. Google's search engine was an important stepping stone to the behavior that actually gave them lock-in, which was an aggressive, anti-competitive and generally illegal effort to monopolize the market for online advertising through acquisitions and boxing out competitors.

    It was really only possible because, for whatever reason, we decided to completely stop enforcing antitrust laws for a decade or two.

  • 400 million use it for free; you can give away 400 million of anything for free. The question is how many are willing to pay the monthly fee required to stop OpenAI from bleeding $5 billion/year and return the promised trillions to investors.

[flagged]

  • OpenAI is spending $2 for every $1 it earns. It's certainly eating its investors' lunch, but it's not a sustainable business yet and from all accounts doesn't have a clear plan for how to become one.

    Meanwhile, the ZIRP policies that made this kind of non-strategy strategy feasible are gone.

    • I wouldn't worry. Retracting ZIRP policies gave governments two choices: reduce spending by ~10% on average (in Europe), or cheat and scheme to bring ZIRP back, on a two-year timer.

      Interest rates rose, but came back down before the two years were even up (the rate rises started 27 Jul 2022; rates started coming back down 12 Jun 2024); governments have been caught cheating, and the number of central bankers replaced has gone up dramatically. Oh, and none of the governments have reduced spending. Literally not a single one. In fact, Germany has agreed to an unprecedented increase in debt financing of its government.

      In other words, ZIRP, and even negative rates, are coming back, and a lot sooner than most people think. Your next house, despite everything that's happened, will be more expensive. But I doubt this will save either OpenAI or Tesla.