Comment by KoolKat23

1 year ago

That is indeed cynical haha.

A very simple observation: our brains are vastly more efficient, obtaining vastly better outcomes from far less input. That alone suggests there's plenty of room for improvement without needing to go looking for more data. Short-term versus long-term gain, like you say: shareholder return.

More efficiency means more practical, useful applications at lower cost, as opposed to a bigger model, which is less useful (longer inference times) and more expensive (data synthesis and training costs).

That’s assuming that LLMs act like brains at all.

They don’t.

Especially not with transformers.

  • Says who?

    • At a fundamental level, brains don’t operate on floating point numbers encoded in bits.

      They use neurotransmitters and electrochemical signaling that modulate how they respond to input. They don't throw away all knowledge of what they just said. They change continuously, not just in discrete training loops. They don't operate in turns (rough sketch below).

      I could go on.

      Honestly, the number of people who hear "learning," "neural networks," and "memory" and assume AI must work like a biological brain is insane.

      Truly a marvel of marketing.
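
      To make the "turns" point concrete, here's a minimal sketch in Python. It's a toy stand-in, not any real model's API: the weights are frozen at inference time, and the model only "remembers" what you explicitly re-feed it each call.

      ```python
      # Toy illustration of stateless, turn-based inference:
      # the "model" is a pure function of frozen weights plus
      # whatever context the caller re-sends every turn.

      from typing import List

      FROZEN_WEIGHTS = {"bias": 0.5}  # fixed after training; never updated here

      def generate(weights: dict, context: List[str]) -> str:
          # Stand-in for a forward pass: output depends ONLY on the
          # arguments. No state persists inside the model between calls.
          return f"reply {len(context)} (bias={weights['bias']})"

      history: List[str] = []
      for user_turn in ["hello", "what did I just say?"]:
          history.append(user_turn)
          # Each turn re-processes the ENTIRE history from scratch;
          # drop a line from `history` and the model has no trace of it.
          reply = generate(FROZEN_WEIGHTS, history)
          history.append(reply)
          print(reply)
      ```

      Contrast that with a brain, which carries state forward continuously and updates itself as it goes, rather than recomputing from a transcript.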
