Comment by segmondy
16 hours ago
probably because maybe 1 or 2 folks on here can run it? It's a 1000B model: at 16-bit precision that's about 2000 GB of GPU VRAM just for the weights, or roughly 63 RTX 5090s (32 GB each) hooked up to the same machine. Even at Q2 (~2 bits per weight) you'd still need around 8 of them.
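A quick sanity check of that arithmetic (just a sketch: assumes 32 GB of VRAM per 5090 and counts weights only, ignoring KV cache and activation overhead):

```python
# Back-of-the-envelope VRAM math for a 1000B-parameter model.
# Assumption: 32 GB per RTX 5090; weights only, no KV cache/activations.
PARAMS_B = 1000      # model size in billions of parameters
GPU_VRAM_GB = 32     # RTX 5090

def vram_gb(bits_per_param):
    """Weight memory in GB at the given precision."""
    return PARAMS_B * 1e9 * bits_per_param / 8 / 1e9

for label, bits in [("fp16", 16), ("Q2", 2)]:
    gb = vram_gb(bits)
    print(f"{label}: {gb:.0f} GB ~= {gb / GPU_VRAM_GB:.1f} x 5090s")
```

So fp16 works out to 2000 GB (~63 cards) and Q2 to 250 GB (~8 cards).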
Still more accessible than fully closed models.