Comment by embedding-shape
21 days ago
> We need models that keep on learning (updating their parameters) forever, online, all the time.
Do we need that? Today's models are already capable in lots of areas. Sure, they don't match up to what the uberhypers are talking up, but technology seldom does. Doesn't mean what's there already cannot be used in a better way, if they could stop jamming it into everything everywhere.
Continuous learning in current models will lead to catastrophic forgetting.
Will catastrophic forgetting still occur if a fraction of the update data is drawn from the original training corpus? (Toy sketch below.)
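What's being described is essentially rehearsal / experience replay. A minimal toy sketch of the idea, with assumptions not in the thread: PyTorch, two synthetic regression tasks standing in for "original corpus" and "new data", and an illustrative `replay_frac` parameter:

```python
# Sketch: continue training on task B while replaying a fraction of task A
# in every batch, so the model keeps seeing the old distribution.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy stand-ins: task A = "original corpus", task B = "new data".
xa = torch.randn(512, 8); ya = xa @ torch.randn(8, 1)
xb = torch.randn(512, 8); yb = xb @ torch.randn(8, 1)

model = nn.Sequential(nn.Linear(8, 32), nn.Tanh(), nn.Linear(32, 1))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()

def loss_on(x, y):
    with torch.no_grad():
        return loss_fn(model(x), y).item()

# Pretrain on task A.
for _ in range(300):
    opt.zero_grad(); loss_fn(model(xa), ya).backward(); opt.step()
print(f"after pretraining, loss on A: {loss_on(xa, ya):.3f}")

# Continue on task B, mixing a fraction of A into each batch.
replay_frac = 0.25  # assumption: 25% old data per batch, purely illustrative
for _ in range(300):
    nb = 64
    na = int(nb * replay_frac)
    ia = torch.randint(0, len(xa), (na,))
    ib = torch.randint(0, len(xb), (nb - na,))
    x = torch.cat([xa[ia], xb[ib]]); y = torch.cat([ya[ia], yb[ib]])
    opt.zero_grad(); loss_fn(model(x), y).backward(); opt.step()

print(f"after B with replay, loss on A: {loss_on(xa, ya):.3f}, "
      f"on B: {loss_on(xb, yb):.3f}")
# With replay_frac = 0.0 the loss on A typically climbs back up (forgetting);
# with a modest fraction it stays low, at some cost to progress on B.
```

This doesn't settle whether replay scales to LLM-sized training runs, but it shows the mechanism the question is asking about.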
Is the real issue actually catastrophic forgetting, or overfitting?
Nothing prevents users from continuing the training as they use a model.
Catastrophic forgetting is overfitting.