Comment by eikenberry

19 hours ago

Won't the need to train increase as the need for specialized, smaller models increases and we need to train their many variations? Also what about models that continuously learn/(re)train? Seems to me the need for training will only go up in the future.

That's the thing - nobody knows. LLM architecture is constantly evolving and people are trying all kinds of things.