Comment by rvz

3 years ago

He has just admitted that O̶p̶e̶n̶AI.com has partially trained GPT-5 and is already planning to test the so-called guardrails around it.

There is no 'revolution' here. Just 'evolution': more data and an even more excessive waste of compute to create another so-called AI black-box sophist, with Sam Altman selling both the poison (GPT-4) and the antidote (Worldcoin).

At some point, O̶p̶e̶n̶AI.com and Microsoft will leverage their tremendous lock-in to upsell and compete against their own partners.

> Sam Altman selling both the poison (GPT-4) and the antidote (Worldcoin).

I actually find it pretty amazing that more people aren't given pause by Sam Altman's involvement here. After the WorldCoin stuff, I'd think that he'd be viewed with a much more skeptical eye in terms of his ethics.

  • The late-night March 31 release of Worldcoin had the (unintended?) side effect of making me think "a token to prove my personhood" was an April Fools' joke when I saw it the next morning, and I never thought about it again.

    • There is still plenty of time to ask Ronald Wayne and Steve Wozniak why Apple Inc. was the worst April Fools' joke in history.

      Worldcoin isn't an April Fools' joke either; they are quite serious about selling their proof-of-personhood anti-AI snake-oil antidote.

      Why? It's clear that it positions Mr Altman so that he cannot lose: he is hedged on both sides of wherever this AI narrative goes.

> has partially trained GPT-5

From my understanding, they train each new GPT model from a checkpoint of the previous generation, so technically they have already partially trained multiple future models in their GPT lineage.
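To illustrate what "training from a checkpoint of the previous generation" means in practice, here is a minimal toy sketch in plain Python: a one-parameter model is trained, its weights are saved as a checkpoint, and a "next generation" resumes from those weights instead of a random start. All names and the model itself are illustrative assumptions, not anything from OpenAI's actual pipeline.

```python
import json

# Toy model: y = w * x, fitted by gradient descent on mean squared error.
def train(w, data, lr=0.01, steps=100):
    """Run gradient-descent steps on w and return the updated weight."""
    for _ in range(steps):
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        w -= lr * grad
    return w

# "Generation 1": train from scratch on old data, save a checkpoint.
gen1_data = [(x, 2.0 * x) for x in range(1, 6)]   # true w = 2
w_gen1 = train(0.0, gen1_data)
checkpoint = json.dumps({"w": w_gen1})             # stand-in for a saved checkpoint file

# "Generation 2": warm-start from the checkpoint instead of from scratch,
# then continue training on new data with a different target.
w_gen2 = json.loads(checkpoint)["w"]
gen2_data = [(x, 3.0 * x) for x in range(1, 6)]   # true w = 3
w_gen2 = train(w_gen2, gen2_data)

print(round(w_gen1, 2), round(w_gen2, 2))          # gen2 drifts from gen1 as it keeps training
```

Real LLM training does the same thing in spirit (load the old `state_dict`, keep optimizing on new data), just with billions of parameters instead of one.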

Could you detail how Worldcoin is an antidote to GPT-4?

  • Worldcoin price skyrockets => cryptocurrency speculation returns => GPU shortage prevents further training of LLMs

    • The GPU shortage ended not because of a crypto crash, but because Ethereum switched to PoS and ditched GPUs. Other coins pay out only a fraction of what Ethereum mining did, so mining them on GPUs isn't profitable unless you have free electricity.

    • Literally none of the LLMs we're talking about were trained on consumer GPUs, where the market shortage mattered; they used things like NVIDIA A100 pods or custom hardware like Google TPU clusters.

      The GPUs that are good for cryptomining are decent for running ML models but not good for training LLMs, and vice versa: the hardware requirements diverge. Training large LLMs requires not only high compute but also lots and lots of memory and extremely high-speed interconnect, which costs far more than the pure compute cryptomining needs, making that hardware cost-inefficient for mining.