Comment by jhatemyjob

6 days ago

You raise a valid concern, but you presume that we will stay under the OpenAI/Anthropic/etc. oligopoly forever. I don't think that will be the status quo in the long term. There is demand for different types of LLMs trained on different data, and there is demand for hardware. For example, the new Mac Studio has 512GB of unified memory, which can run the ~671B-parameter DeepSeek model locally (quantized). So in the future I could see people training their own LLMs to be experts at their language/framework of choice.
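As a rough sanity check on that claim (assuming a ~4.5 bits/weight quantization and ignoring KV cache and activation overhead, so rough numbers only):

```python
# Back-of-envelope check: does a ~671B-parameter model fit in 512 GB of
# unified memory? Assumes ~4.5 bits/weight for a Q4-ish quant; the constants
# here are assumptions, not measured figures.

PARAMS = 671e9          # DeepSeek-V3/R1 total parameter count
BITS_PER_WEIGHT = 4.5   # assumed average for a 4-bit-class quantization
UNIFIED_MEM_GB = 512    # Mac Studio max memory configuration

weights_gb = PARAMS * BITS_PER_WEIGHT / 8 / 1e9
print(f"Quantized weights: ~{weights_gb:.0f} GB")   # ~377 GB
print(f"Headroom left:     ~{UNIFIED_MEM_GB - weights_gb:.0f} GB")  # ~135 GB
# At full 8-bit the weights alone would be ~671 GB and would NOT fit,
# which is why the "runs locally" claim hinges on quantization.
```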

Of course you could disagree with my prediction and argue that these big tech companies are going to build MASSIVE GPU farms the size of the Tesla Gigafactory, running godlike AI that nobody can compete with. But if we get to that point, I feel like we will have bigger problems than "AI React code is better than AI SolidJS code".

I suspect we’ll plateau at some point and the gigafactories won’t produce a massive advantage. So running your own models could very well be a thing.

  • Yeah, probably... I wonder when the plateau will be. Is it right around the corner, or ten years from now? Based on what Sam Altman is saying, it seems like they can just keep scaling forever. I'm botching the quote, but either he or George Hotz said something to the effect of: every time you add an order of magnitude to the size of the training data, there is a noticeable qualitative difference in the output. But maybe past a certain size you hit diminishing returns. Or maybe it's like Moore's Law, where people thought it would go on forever but it turned out to be extremely difficult to push feature sizes below 7nm. A toy sketch of what that flattening would look like is below.
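    To make the "diminishing returns" intuition concrete, here's a toy sketch assuming a Chinchilla-style power law for loss vs. dataset size; the constants are invented and only the shape matters:

    ```python
    # Toy illustration of the scaling-law intuition above (constants are made up):
    # if loss follows a power law in dataset size D, each extra 10x of data buys
    # a smaller absolute improvement than the last 10x did.

    E, B, beta = 1.7, 400.0, 0.28   # hypothetical irreducible loss / fit constants

    def loss(d_tokens: float) -> float:
        """Chinchilla-style L(D) = E + B / D**beta (shape only, numbers invented)."""
        return E + B / d_tokens ** beta

    prev = None
    for exp in range(9, 15):        # 1e9 .. 1e14 training tokens
        d = 10.0 ** exp
        l = loss(d)
        delta = "" if prev is None else f"  (improved by {prev - l:.3f})"
        print(f"D = 1e{exp:>2} tokens -> loss {l:.3f}{delta}")
        prev = l
    # Each added order of magnitude still helps, but by less and less --
    # the curve flattens gradually rather than hitting a hard wall.
    ```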