Comment by Ginden
11 hours ago
Because almost everyone involved in the AI race grew up in "winner takes all" environments typical of software, and they try really hard to make that a reality. This means your model should do everything to take 90% of the market, or at least 90% of a specific niche.
The problem is, they can't find a moat despite searching very hard: whatever you bake into your AI, your competitors will be able to replicate within a few months. This is why OpenAI is striking a deal with Disney; copyright provides such a moat.
> your competitors will be able to replicate within a few months.
Will they really be able to replicate the quality while spending significantly less on compute? If not, then the moat is still how much capital you can acquire to burn on training.
There are multiple tech companies with quadrillion-deep pockets.
Is that not what distillation is?
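For context, distillation here means training a smaller "student" model to match a larger "teacher" model's output distribution instead of learning from raw data, which is exactly the "replicate the quality for less compute" move. A minimal sketch of the core loss (temperature value and logits are illustrative, not from any real model):

```python
import numpy as np

def softmax(logits, temperature=1.0):
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # KL(teacher || student) on temperature-softened distributions:
    # the student is pushed toward the teacher's full output distribution,
    # not just its top-1 answer.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return float(np.sum(p * (np.log(p) - np.log(q)), axis=-1).mean())

# Hypothetical logits for one 3-class example.
teacher = np.array([[5.0, 1.0, 0.5]])
good_student = np.array([[4.8, 1.1, 0.4]])   # close to the teacher
bad_student = np.array([[0.5, 5.0, 1.0]])    # disagrees with the teacher

# A student that mimics the teacher gets a much lower loss.
assert distillation_loss(teacher, good_student) < distillation_loss(teacher, bad_student)
```

In practice you would minimize this loss (usually mixed with the ordinary hard-label loss) over the teacher's outputs on a large query set, which is why API access alone can leak a lot of a model's capability.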
What does moat even mean anymore
> copyright provides such a moat.
Been saying this since the 2014 Alice case. Apple jumped into content production in 2017. They saw the long-term value of copyright interests.
https://arstechnica.com/information-technology/2017/08/apple...
Alice changed things such that code-monkey algorithms were not patentable (except in some narrow cases where true runtime novelty can be established). Since the transformers paper, the potential of self-authoring content was obvious to those who can afford to think about things rather than hustle all day.
Apple wants to sell AI in an aluminum box while VCs need to prop up data center agrarianism; they need people to believe their server farms are essential.
Not an Apple fanboy but in this case, am rooting for their "your hardware, your model" aspirations.
Altman, Thiel, the VC model of making serfs tend their server fields, their control of foundation models: it's a gross feeling. It comes with a near-religious sense of fealty to political hierarchy and social structure that exists only as a hallucination in the dying generations. The 50+ year old crowd cannot generationally churn fast enough.
OpenAI's opsec must be amazing; I had fully expected some version of ChatGPT to be leaked on torrent sites at some point this year. How do you keep something that could be exfiltrated on a single hard disk from escaping your servers in all cases, forever?
The model size is probably the thing here. I suspect they took the FAANG remote-workstation approach, where VS Code runs on a remote machine. After all, it's not that great having a desktop with 8 monster GPUs under your desk (x100).
Plus, moving all that data around is expensive. Keeping things in the datacenter means it's faster and easier to secure.
Totally agree, people love to talk about how hopelessly behind Apple is in terms of AI progress when they’re in a better position to compete directly against Nvidia on hardware than anyone else.
Apple's always had great potential. They've struggled to execute on it.
But really, so has everyone else. There are two "races" in AI: creating models, and finding a consumer use case for them. Apple just isn't competing in creating models with the likes of OpenAI or Google. They also haven't really done much with LLMs to deliver 'revolutionary' general-purpose user-facing features, but neither has anyone else beyond chat bots.
I'm not convinced ChatGPT as a consumer product can sustain current valuations, and everyone is still clamouring to find another way to present this tech to consumers.
1 reply →
My goodness, are you really saying, in effect, "I wish people over 50 would just hurry up and die"?!?
Good lord, expressing that kind of sentiment does not make for a useful and engaging conversation here on Hacker News.
Clutch pearls
Striking deals without a proper vision is a waste of resources. And that’s the path OAI is on.
It's also why they bought 40% of the world's RAM supply.
Committed to buying. They don't have the money to actually buy it (at least not yet).