Comment by mattgreenrocks

2 days ago

Underscoring the parent comment and adding to it: watching technologists on a site called Hacker News cheer on the centralization of power is really something.

I don't think any power is as centralized now as Google was to search about 10 years ago, or as Facebook was to social media in the same time frame. What has changed other than the players?

  • The dynamics. Discovery benefits all parties, and the middleman can take a cut in several ways (Google chose ads). The middleman never had to open up, but that tube spread value instead of extracting it (at least, until they started rent-seeking with the tube).

    Being the one-stop knowledge hub that sucks value from everyone else only benefits the leech long term.

  • Google still offered a path for businesses and individuals that allowed both sides to profit immensely via advertising. Google also guided people to the sources of information once you looked past the ads.

    With the AI companies, they suck up all freely available and proprietary information, hide the sources, and give the information away to consumers mostly for free.

Last 3 years of discourse in a nutshell. Sinclair's quote rings true once again... Just a shame people don't think of the long-term cost of this trend chasing.

But then again, it wouldn't be a trend if people thought long term, would it?

I think this phase of centralising power is part of the never-ending cycle of centralisation and distribution - mainframes -> PCs -> websites -> apps, and so on round we go. We will get a "data centres -> Personal LLMs" phase of the cycle which distributes it again.

So my hope is that LLMs become local in a few years.

We've been sitting around 16 GB of RAM on a laptop for 10-15 years now, not because RAM is too expensive or difficult to make, but because the average user has had no need for more than that. We could get "normal" laptop RAM up to 16 TB in a few years if there were commercial demand for it.
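To put rough numbers on the RAM point: the memory a local LLM needs is mostly its weights, which scale with parameter count and quantization level. A minimal back-of-the-envelope sketch (illustrative arithmetic only; real runtimes also need RAM for the KV cache, activations, and the OS, so these are lower bounds):

```python
def weight_ram_gb(params_billions: float, bits_per_weight: float) -> float:
    """Approximate GB (10^9 bytes) needed just to hold a model's weights."""
    return params_billions * 1e9 * bits_per_weight / 8 / 1e9

# An 8B-parameter model quantized to 4 bits fits easily in 16 GB:
print(weight_ram_gb(8, 4))    # 4.0 GB of weights

# A 70B model at 4 bits needs ~35 GB -- past today's standard laptop:
print(weight_ram_gb(70, 4))   # 35.0 GB

# The same 70B model at full 16-bit precision needs 140 GB:
print(weight_ram_gb(70, 16))  # 140.0 GB
```

Which is why a bump in "normal" laptop RAM matters so much here: each doubling moves a whole tier of models from data-centre-only to locally runnable.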

We have processor architectures that are better suited to running LLMs faster and more efficiently. We could include those in a standard laptop if there were commercial demand for it.

Tokens are getting dramatically cheaper, and will continue to do so. But we have an upper limit on LLM training complexity (we only have so much Internet data to train them on). Eventually the race between LLM complexity and processing speed will end, probably with processing speed as the winner.

So my hope is that our laptops change: that they include a personally adapted, very capable LLM, run locally, and that we start to see a huge variety of LLMs available. I guess the closest analogy would be the OS from "Her"; less typing, more talking, and something that is personalised, appears to actually know the user, and runs locally (which is important).

I don't see anything stopping Linux from doing this too (but I'm not working in this area so I can't say for sure).

Obviously we'll face the usual data thieves and surveillance capitalism along the way, but that's part of the process.