Comment by thepasch
1 day ago
Article title: “The US is winning the AI Race”
Article content: “The US are capitalizing on AI the best”
A lot of assumptions there that no one can actually verify as true right now. If commercialization into rent-seeking SaaS landscapes is the endgame, then yeah, the US is winning the AI race. If individualization, local LLMs, and consumer hardware are the endgame, China is winning the AI race. If it's something entirely different - if LLMs are the wall and research is what grants the next breakthrough, or if compute and memory requirements take a dive, or whatever - then we have no idea who's winning the race, because that stuff is mostly happening behind closed doors.
That seems like a lot of rationalization to me. China is pursuing these because they cannot compete on the frontier. Yes, there is a possibility that all that compute is not needed, but it is a rather remote possibility, and there is no doubt that, given the choice, China would be pursuing frontier model building with closed, proprietary-only offerings.
All that compute is not needed. We have an existence proof from biology in the form of natural intelligence that much greater efficiency is possible. However, achieving dramatic improvements in compute efficiency will depend on unpredictable scientific breakthroughs. Personally I suspect that an entirely new hardware architecture will be needed, although I don't have any hard evidence to back that up.
>We have an existence proof from biology in the form of natural intelligence that much greater efficiency is possible.
It's only a proof that it's possible with 18+ years of training.
1 reply →
>from biology ... much greater efficiency is possible
Those are much more specialized models with pretty mediocre tokens per second.
3 replies →
I dunno, DeepSeek v4 Pro is rather on par as far as I can tell, maybe not with 5.5 Pro in all areas quite yet, but close.
I think China sees the application layer on top of models as mattering more than the models themselves, so they don't need to gatekeep the models as much.
China is competing in value AI because they cannot work at the frontier, but how is this bad at all? It’s like how the USA has the best drones but they are a few million dollars apiece while China has DJI.
If China could work at the frontier, I don’t know, I kind of think they would still be dumping a lot of resources into exploring the value side since they have that culture already in place.
I did not imply it was bad. I implied that competing in value AI is the only option that China-based AI companies have due to limitations in compute.
1 reply →
Forgive me if this is a naive assumption, but wouldn't large language models be fundamentally different for a language that is largely symbols? Again, my understanding of Mandarin is limited, if it exists at all.
All tokens are symbols. All of the frontier models speak Mandarin.
6 replies →
"飞机" and "airplane" aren't fundamentally different in terms of how they're represented to a computer. Especially for an LLM, where tokenization likely turns each of those into a single token.
> China is pursuing these because they cannot compete on the frontier.
? Claude, ChatGPT, etc are heinously expensive for tiny benefits lmao. Local + efficient is clearly the future
> ? Claude, ChatGPT, etc are heinously expensive for tiny benefits lmao
Unfortunately, local inference is inefficient - hundreds of times less efficient than cloud inference. When you answer one request at a time, you still have to fetch all active weights into the compute units once per token. When you run a batch of 300, you load the weights once and compute 300 tokens at a time.
Compared to cloud, local inference is also less flexible. You can't scale up 5x or 20x, you can't handle spikes, and you pay for the hardware whether you use it or not - and the usage factor is very low, something like 5%. And to run a decent model, your system costs $2000 or more.
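Back-of-envelope numbers make the gap concrete. Decoding is usually memory-bandwidth bound, so streaming the weights dominates; the model size and bandwidth below are illustrative assumptions, not measurements:

    # Token generation is typically memory-bandwidth bound: each decoded
    # token requires reading all active weights. Illustrative numbers only.
    weights_gb = 140        # e.g. a ~70B-parameter dense model at 16 bits
    bandwidth_gb_s = 1000   # ~1 TB/s, roughly datacenter-GPU HBM territory

    t_per_token = weights_gb / bandwidth_gb_s   # seconds per weight pass

    # Batch of 1: every token pays the full weight-read cost.
    print(f"batch=1:   {1 / t_per_token:7.0f} tokens/s total")

    # Batch of 300: one pass over the weights serves 300 requests at once
    # (ignoring compute limits and the extra memory for KV caches).
    print(f"batch=300: {300 / t_per_token:7.0f} tokens/s total")

Batching isn't free in practice (KV-cache memory, latency, scheduling), but the roughly two-orders-of-magnitude difference is where the "hundreds of times" figure comes from.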
AI boosters cling to this notion because it's the only way the massive data center buildouts make any sense at all. I guess you could say the US is winning the frontier AI race. Okay. But I'm never going to grant a cloud service access to all the contents of my hard drive; that's just never going to happen. So if you expect me, and the many people who feel similarly, to get on this train, you'd better have a local, lightweight model too, or we're not even having a discussion - the answer is just no.
5 replies →
> ? Claude, ChatGPT, etc are heinously expensive for tiny benefits lmao. Local + efficient is clearly the future
Corporate America is where the money is, and corporate America will dictate which products are successful by virtue of spend. Individuals aren't going to be paying $100s or $1000s/month en masse for these models, but businesses will be. Being local and efficient isn't that important at this stage, but even so, as American companies continue to scale and invest, they'll be able to make those models more local and efficient if the market wants it. Sort of like how you had a big, giant desktop computer and now you've got a supercomputer in your pocket. Going straight to "local and efficient" means going straight to being behind, because at some point, perhaps even now, the local and efficient model won't be able to keep up.
For some reason people think that they somehow know something that Google or Nvidia or whoever, with hundreds of billions of dollars of real money at stake, don't already know, and it's both amusing and bizarre to see this play out again and again in off-hand comments like "lol tiny benefits".
You buy an iPhone even though the cheap-o Wal-Mart Android phone for $100 "does the same thing". Except that in this case the Android phone just puts you out of business while those spending big money for "tiny benefits" beat you in the market.
20 replies →
Well, China is consistently six months behind the frontier labs (possibly because they can harvest data from released frontier models). If the scaling continues, the US will win; if not, then China will win as the models converge.
The non-release of Mythos tells you the future of that, so long as they can keep the weights from being exfiltrated. Once models become true national security threats, they won't be released in their full form. The hitch-a-ride approach becomes less capable of keeping up.
6 replies →
"AI in the datacenter" and "AI on local consumer hardware" will eventually be two separate niches with entirely different capabilities, at least if scaling laws continue unchanged and there's no near-term inherent limit to AI smarts. The real point of the datacenter is to be able to do datacenter-scale things. But you don't need that kind of vast compute to run even the largest open models today: on prem hardware can do it easily especially if you're OK with a somewhat delayed response.
Even without any of that, anyone you ask who's used AI to any professional degree will agree the US is winning the AI race right now. The future, who knows.
[dead]
[flagged]
> US deep state
Strange reading that on HN and realizing I'm not on Facebook
It's hard not to see the "deep state", that group of elite politicians and the super-rich, seizing more and more power in American society.
9 replies →
Snowden.
Ah yes, the "deep state": the formless, nebulous rhetorical tool that is infinitely liquid, fitting into or over whatever container is needed so that its user can satisfy their immense personal problems disguised as eternal doomerism.
I thought it was just an overdramatic term for the unelected bureaucrats that make up the majority of the government, and who have their own institutional momentum.