portaouflop (4 months ago): If I have, let's say, 40 GB of RAM, does it not work at all, or does it just take twice as long to train?
typpilol (4 months ago): It won't work at all. Or if it does, it will be so slow, since it has to go to disk for every single calculation, that it will never finish.
karpathy (4 months ago): It will work great on a 40 GB GPU, probably a bit less than twice as slow. These are micro models of a few billion parameters at most, and they fit easily during both training and inference.
utopcell (4 months ago): How low can this go? Can this run on a 5090 card (32 GiB)?
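For readers wondering whether their card fits, the "a few billion parameters fit easily" claim can be sanity-checked with a rough back-of-the-envelope estimate. This is a sketch, not a measurement: it assumes bf16 weights and gradients plus two fp32 AdamW moments per parameter, and it deliberately ignores activation memory, which varies with batch size and sequence length.

```python
def training_mem_gb(n_params_billion: float,
                    weight_bytes: int = 2,   # assumption: bf16 weights
                    grad_bytes: int = 2,     # assumption: bf16 gradients
                    optim_bytes: int = 8) -> float:
    """Rough GPU memory estimate (GB) for training, ignoring activations.

    optim_bytes = 8 assumes AdamW keeps two fp32 moment tensors
    (4 bytes each) per parameter.
    """
    n_params = n_params_billion * 1e9
    total_bytes = n_params * (weight_bytes + grad_bytes + optim_bytes)
    return total_bytes / 1e9

# A 2B-parameter model: 2e9 * (2 + 2 + 8) bytes = 24 GB of state,
# so it plausibly fits on a 40 GB card but is tight on a 32 GB 5090
# once activations are added.
print(training_mem_gb(2.0))
```

Under these assumptions a ~1B-parameter model needs only about 12 GB of optimizer-plus-weight state, which is consistent with the thread's claim that these micro models train comfortably in 32–40 GB.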