Comment by thatguysaguy
7 months ago
At least part of it is that the capex for LLM training is so high. It used to be that compute was extremely cheap compared to staff, but that's no longer the case for large model training.