unleaded, 6 months ago:
ITT nobody remembers gpt2 anymore and that makes me sad

GaggiX, 6 months ago (in reply):
This model was trained on 6T tokens and has 256k embeddings, quite different from a gpt2 model of comparable size.
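A rough back-of-the-envelope sketch of why a 256k-entry vocabulary is a meaningful difference: the token-embedding table alone scales with vocab size times hidden width. GPT-2 small's published values are a 50,257-token vocabulary and a 768-dimensional width; the hidden width of the model discussed in the thread is not given here, so the 768 used below is only an assumption to make the comparison concrete.

```python
def embedding_params(vocab_size: int, hidden_size: int) -> int:
    """Parameters in the token-embedding table alone (vocab x width)."""
    return vocab_size * hidden_size

# GPT-2 small: 50,257-token vocabulary, 768-dim embeddings (published values).
gpt2_small = embedding_params(50_257, 768)

# 256k-vocabulary model; the 768 width is an assumption for illustration only.
big_vocab_model = embedding_params(256_000, 768)

print(f"GPT-2 small embedding table:   {gpt2_small:,} params")   # ~38.6M
print(f"256k-vocab embedding table:    {big_vocab_model:,} params")  # ~196.6M
```

Even before counting the transformer blocks or the 6T-token training run, the embedding table alone is several times larger at the same width, which is why a "comparable size" comparison to gpt2 is misleading.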