Comment by AdminAccount
4 hours ago
I think the only reason StackOverflow still has any activity is that the community chose to ban AI content [1], and so did most of the other sites in its network [2].
Perhaps it will even see a (small) resurgence when AI providers start charging what their services actually cost.
Considering StackOverflow is now providing ground truth for AI training, I believe the ban is more about not poisoning the well than about keeping StackOverflow or StackExchange human-friendly.
That ship sailed a long time ago, with zealot admins and verbal harassment.
> That ship sailed a long time ago, with zealot admins
While there are certainly strong examples of this, a lot of people mistake enforcing the rules for zealotry. Part of the point of SO was that, if things don't change, there is a completed state for SO too - no need to re-ask duplicate questions as on platforms where a post is less long-lived. Unfortunately people take things like “this is a dup”, “provide more information or we can't help”, “this isn't a complete answer”, and so forth, as deeply personal attacks…
One of the good things about LLMs is that they've drawn off all the simple, already-answered questions! Unfortunately the more complex ones, and the ones about new solutions, are also going there, so SO and its family of sites is ceasing to grow even in the ways it wants to.
> and verbal harassment.
Again, that did (and does) happen, but a lot less often than some people claim. The most abusive people I've seen on there are those who have been given one of the responses I listed above.
You're absolutely right. The problem was so small that SO only had to run a site-wide survey, make a couple of public statements, push through big administrative changes, and mount a big campaign to win back hearts and minds.
Even after that, I still feel sour about the site. Talk about burnt bridges.
Moreover, in 2025, 46% of survey participants said they don't feel like part of the community [0]. The number was 44% in 2024 [1]. 2023 doesn't look better either: 45.63% of participants said no [2].
Maybe the survey is rigged. Who knows?
[0]: https://survey.stackoverflow.co/2025/stack-overflow#2-feel-l...
[1]: https://survey.stackoverflow.co/2024/community#2-feel-like-a...
[2]: https://survey.stackoverflow.co/2023/#section-stack-overflow...
People also always bring up the "fake XY problem" thing on SO as a sign of toxicity or whatever, but I’ve had many, many search results where it really was an XY problem, and the actual problem Y was solved, yet I landed there searching for a solution to X :/
The AI companies aren't so deep in the red when you only look at inference though - they are investing loads in new models in an AI arms race.
I don't imagine AI is going away, though, especially now that there are open source models like Qwen that you can run locally. Even if those American behemoths go bankrupt, it will persist.
> The AI companies aren't so deep in the red when you only look at inference though - they are investing loads in new models in an AI arms race.
Depends on how you're looking at it (using speculated numbers for easy math):
1. Having operating costs of $10b on revenue of $100m is very deep in the red, regardless of training costs.
2. Having $90m in training costs and $10m in inference costs on $100m of revenue means they're just breaking even overall, even though inference alone is profitable.
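The two scenarios can be sketched in a few lines of Python. All figures here are the hypothetical round numbers from above, not anyone's actual financials:

```python
# Hypothetical numbers only, mirroring the two scenarios above;
# none of these figures come from any real company's filings.

def margin(revenue, inference_cost, training_cost=0):
    """Return (inference-only margin, overall margin) as fractions of revenue."""
    inference_margin = (revenue - inference_cost) / revenue
    overall_margin = (revenue - inference_cost - training_cost) / revenue
    return inference_margin, overall_margin

# Scenario 1: inference itself is deeply unprofitable.
s1 = margin(revenue=100e6, inference_cost=10e9)

# Scenario 2: inference is cheap, but training eats the whole margin.
s2 = margin(revenue=100e6, inference_cost=10e6, training_cost=90e6)

print(s1)  # inference-only margin already hugely negative
print(s2)  # inference-only margin positive, overall margin roughly zero
```

The point is that "not so deep in the red on inference" only tells you anything in the second kind of world, and we can't see from the outside which world we're in.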
Problem is, we don't know their financials or how they break down (they could, of course, clear up the confusion and release some numbers, but they aren't doing that right now); all we know is when they need a new raise to continue operating.
From the raises we can estimate their operating costs (for example, raising $30m in 2024 and then $300m in 2025 suggests a 10x increase in operating costs, because these companies aren't spending on capex - the training is done as opex).
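That raise-based inference is just a ratio. A tiny sketch, using the made-up figures from the example and assuming (as the argument does) that each raise roughly covers operating costs until the next round:

```python
# Hypothetical raise amounts from the example above - not real data.
# Assumption: with no capex, raise size tracks operating costs.
raises = {2024: 30e6, 2025: 300e6}

growth = raises[2025] / raises[2024]
print(f"Implied opex growth: {growth:.0f}x")  # prints "Implied opex growth: 10x"
```

Of course, raises also pre-fund future spending and build cash buffers, so this only gives a rough upper-ish bound, which is the commenter's caveat about not knowing the real breakdown.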
From their subscriptions (whose numbers are all only estimates), we can sort of tell what the revenue is, but that's for subscriptions only, which were almost certainly running at a loss (until recently, anyway). We don't even have estimates of the revenue from PAYG API users. Common sentiment is that you'd be a fool to use the PAYG options for anything but trialing the service, but the world is filled with fools, so you never know!
What is interesting is comparing PAYG prices from the providers serving open models with PAYG prices for the closed models. The open-model providers aren't spending on training, so their token prices are pretty close to the actual cost of running the models. This is partially confounded by the fact that many of them are VC-backed rather than bootstrapped, and so will also try to perform land grabs via subsidised tokens: their goal is an exit via buyout, and without an eventual acquisition they will simply fail.
I can't think of many open-model providers offering subscriptions - not ones that subsidise the subscription, at any rate.
The first IPO of these SOTA providers is going to be the eye-opener; we'll finally see their financials and we'll see just how much the PAYG was subsidised, and how much the subscriptions were subsidised.
Until then, with collective industry investment of $800b (last I checked) and collective revenue of $20b (last I checked), they are most definitely operating in the red by most common definitions of the term.