returnInfinity 3 days ago
It's not unlimited; the compute allocation was one of the reasons for the coup at OpenAI.
pbmonster 3 days ago
Pretty sure that was scientists competing for six-month training runs of new 100B+ parameter models, not coders burning through a couple of million tokens.