Comment by dathinab

1 year ago

> is it really that big a deal

yes, at least as long as you're constantly developing new AI models

and you still need to run the models, and for GPT-4 alone that is already non-trivial (energy- and compute-wise)

though for small LLMs, if they are not run too much, it might not be that bad

---

Generally I would always look for ulterior motives behind any "relevant" public statement Sam Altman makes. As history has shown, there often seems to be one (though in that usage "ulterior" has a bit too much of a "bad"/"evil" undertone).

To cut it short: he seems to be invested in a nuclear fusion company, which is one of the potential ways to "solve" that problem. Another potential way is to use smaller LLMs, but smaller LLMs could also be a way for OpenAI to lose its dominant position, as there is a much lower barrier to training them.