Comment by thisgetsit

11 hours ago

> copyright provides such a moat.

Been saying this since the 2014 Alice decision. Apple jumped into content production in 2017; they saw the long-term value of copyright interests.

https://arstechnica.com/information-technology/2017/08/apple...

Alice changed things such that code monkeys' algorithms were no longer patentable (except in some narrow cases where true runtime novelty can be established). Since the transformers paper, the potential of self-authoring content has been obvious to those who can afford to think about things rather than hustle all day.

Apple wants to sell AI in an aluminum box, while VCs need to prop up data-center agrarianism; they need people to believe their server farms are essential.

Not an Apple fanboy, but in this case I'm rooting for their "your hardware, your model" aspirations.

Altman, Thiel, the VC model of making the serfs tend their server fields, and their control of foundation models all leave a gross feeling. It comes with the most religion-like sense of fealty to political hierarchy and social structure, one that exists only as a hallucination in the dying generations. The 50+ crowd cannot generationally churn fast enough.

OpenAI's opsec must be amazing; I had fully expected some version of ChatGPT to be leaked on torrent sites at some point this year. How do you keep something that could be exfiltrated on a hard disk from escaping your servers in all cases, forever?

  • The model size is probably the thing here. I suspect they took the FAANG remote-workstation approach, where VS Code runs on a remote machine. After all, it's not that great having a desktop with 8 monster GPUs under your desk (x100).

    Plus moving all that data about is expensive, and keeping things in the datacenter means it's faster and easier to secure. A rough sense of the transfer times involved is sketched below.
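
    A minimal back-of-the-envelope sketch of the "moving data is expensive" point; the checkpoint size and link speeds here are illustrative assumptions, not anything OpenAI has published:

        # Rough illustration: time to pull a large model checkpoint out of the
        # datacenter at different (assumed) link speeds.
        model_size_tb = 1.5  # assumed checkpoint size in terabytes, illustrative only

        for label, gbps in [("10 Gbps datacenter link", 10.0),
                            ("1 Gbps office link", 1.0),
                            ("100 Mbps home link", 0.1)]:
            seconds = model_size_tb * 1e12 * 8 / (gbps * 1e9)
            print(f"{label}: ~{seconds / 3600:.1f} hours")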

Totally agree. People love to talk about how hopelessly behind Apple is in terms of AI progress, when they’re in a better position to compete directly against Nvidia on hardware than anyone else.

  • Apple's always had great potential. They've struggled to execute on it.

    But really, so has everyone else. There are two "races" in AI: creating models, and finding a consumer use case for them. Apple just isn't competing in creating models the way the likes of OpenAI or Google are. They also haven't done much to deliver 'revolutionary' general-purpose user-facing features with LLMs, but neither has anyone else beyond chatbots.

    I'm not convinced ChatGPT as a consumer product can sustain current valuations, and everyone is still clamouring to find another way to present this tech to consumers.

    • I think a major part of it is the shovel selling. Nvidia is selling shovels to OpenAI, and OpenAI is selling shovels to endless B2B, consulting, accounting, and software firms buying into it...

My goodness, are you really saying, in effect, "I wish people over 50 would just hurry up and die"?!?

Good lord, expressing that kind of sentiment does not make for a useful and engaging conversation here on Hacker News.