returnInfinity 2 days ago

It's not unlimited; the compute allocation was one of the reasons for the coup at OpenAI.
pbmonster 2 days ago

Pretty sure that was scientists competing for 6-month training runs of new 100B+ parameter models, not coders burning through a couple of million tokens.