Comment by canjobear

5 months ago

Big OpenAI releases usually seem to come with some kind of baked-in controversy, often around keeping something secret. For example, they originally refused to release the weights to GPT-2 because it was "too dangerous" (lol), generating a lot of buzz right before they went for-profit. For GPT-3 they never released the weights at all. I wonder if it's an intentional pattern to generate press and plant the idea that their models are scarily powerful.

Absolutely. They haven't shown much progress since the original release of GPT-4 in March 2023, compared to the rest of the industry.

How quickly do you think funding would dry up if GPT-5 turned out to be merely incremental? I'm betting they're putting up a smokescreen to buy time.

No, there was legit internal pushback about releasing GPT-2. The lady on the OpenAI board who led the effort to oust Sam said in an interview that she and others were part of a group that strongly pushed against the release because they considered it dangerous. But Sam ignored them, which started the "Sam isn't listening" sentiment that built up over time with other grievances.

Don't underestimate the influence of the 'safety' people within OpenAI.

That, plus people always invent some secret money/marketing motive behind everything they don't understand, when the reality is usually a lot simpler. These companies just keep things generally mysterious, and the public fills in the blanks with hype.