Comment by chemotaxis
4 days ago
> I look forward to the "personal computing" period, with small models distributed everywhere...
One could argue that this period was just a brief fluke. Personal computers really took off only in the 1990s, and Web 2.0 happened in the mid-2000s. Now, for the average person, 95%+ of screen time boils down to using the computer as a dumb terminal to access centralized services "in the cloud".
The personal computing era happened partly because, while there was demand for computing, users' connectivity to the internet was poor or limited, so they couldn't just connect to the mainframe. We now have high speed internet access everywhere - I don't know what would drive the equivalent of the era of personal computing this time.
> We now have high speed internet access everywhere
As I travel a ton, I can confidently tell you that this is still not true at all, and I'm kinda disappointed that the general rule of optimizing for bad reception died.
> the general rule of optimizing for bad reception died.
Yep, and people will look at you like you have two heads when you suggest that perhaps we should take this into account, because it adds both cost and complexity.
But I am sick to the gills of using software - be it on my laptop or my phone - that craps out constantly because it wasn't built with unreliable connections in mind: when I'm on the train, in one of the many mobile reception black spots in the areas where I live and work, or when my rural broadband decides to temporarily give up.
It's not that bleeding difficult to build an app that stores state locally and syncs with a remote service when connectivity is restored (a rough sketch of the idea is below), but companies don't want to make the effort because it's perceived as a niche issue that only affects a small number of people a small proportion of the time, and therefore isn't worth the extra effort and complexity.
Whereas I'd argue that it affects a decent proportion of people on at least a semi-regular basis, so it's probably worth the investment.
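A minimal sketch of that local-queue-plus-sync approach, in Python for concreteness. The `SYNC_URL` endpoint, table layout, and payload shape are all hypothetical stand-ins rather than any particular product's API; the point is just that writes land in local storage first, and a flush step retries whenever the network comes back.

```python
# Offline-first write path: every change is committed to a local SQLite queue
# first, and a background step flushes the queue whenever the remote service is
# reachable. SYNC_URL and the payload shape are hypothetical placeholders.
import json
import sqlite3

import requests

SYNC_URL = "https://example.com/api/sync"  # hypothetical endpoint

db = sqlite3.connect("app_state.db")
db.execute("""CREATE TABLE IF NOT EXISTS pending_ops (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    payload TEXT NOT NULL
)""")

def record_change(op: dict) -> None:
    """Persist the change locally; the app keeps working with no network."""
    db.execute("INSERT INTO pending_ops (payload) VALUES (?)", (json.dumps(op),))
    db.commit()

def try_sync() -> None:
    """Flush queued changes when connectivity returns; failures just wait."""
    rows = db.execute("SELECT id, payload FROM pending_ops ORDER BY id").fetchall()
    for row_id, payload in rows:
        try:
            resp = requests.post(SYNC_URL, json=json.loads(payload), timeout=5)
            resp.raise_for_status()
        except requests.RequestException:
            return  # still offline (or server error): keep the op queued, retry later
        db.execute("DELETE FROM pending_ops WHERE id = ?", (row_id,))
        db.commit()

record_change({"type": "note_edit", "note_id": 42, "text": "draft written on the train"})
try_sync()  # call periodically, or when the OS reports the network is back
```

Conflict resolution is the genuinely hard part, but for plenty of apps "last write wins per record" would already beat "craps out on the train".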
I work on a local-first app for fun, and someone told me I was simply creating problems for myself and could just be using a server. But I'm in the same boat as you. I regularly don't have good internet, and I'm always surprised when people act like an internet connection is a safe assumption. Every day I ride an elevator where I have no internet, I travel regularly, I go to concerts and music festivals, and so on.
I don't even travel that much, and still have trouble. Tethering at the local library or coffee shops is hit or miss, everything slows down during storms, etc.
Yeah, British trains are often absolutely awful for this; I started storing music locally on my phone to deal with the abysmal coverage.
Not true because of cost, or because of access? If you consider Starlink high speed, it truly is available everywhere.
Privacy. I absolutely will not ever open my personal files to an LLM over the web, and even with my mid-tier M4 MacBook I'm close to the point where I don't have to. I wonder how much the cat is out of the bag for private companies in this regard. I don't believe the AI companies founded on stealing IP have stopped.
Privacy is a niche concern sadly.
> I don't know what would drive the equivalent of the era of personal computing this time.
Space.
You don't want to wait 3-22 minutes for a ping from Mars.
I'm not sure if the handful of people in space stations are a big enough market to drive such changes.
Privacy, reliable access when not connected to the web, and the principle of decentralization for some. Less supply-chain risk for private enterprise.
Centralized services only became mainstream when everything started to be offered "for free". When the choice was buying outright or paying recurrently, people more often chose to buy.
There are no longer options to buy. Everything is a subscription.
I think people have seen enough of this "free" business model to know the things being sold for free are, in fact, not.
> We now have high speed internet access everywhere
This is such an HN comment, illustrating how little the average HN user knows of the world beyond their tech bubble. "Internet everywhere" - you might have something of a point. But "high speed internet access everywhere" sounds like "I haven't travelled much in my life".
I don't know, I think you're conflating content streaming with central compute.
Also, is percentage of screen time the relevant metric? We moved TV consumption to the PC; does that take away from PCs?
Many apps moved to the web, but that's basically just streamed code run in a local VM. Is that a dumb terminal? It's not exactly independent of local compute...
Nah, your parent comment has a valid point.
Nearly the entirety of the use cases of computers today don't involve running things on a "personal computer" in any way.
In fact, these days everyone kind of agrees that even something as small as hosting a spreadsheet on your own computer is a bad idea. The cloud, where everything is backed up, is the way to go.
But again, that's conflating web-connected, or even web-required, with mainframe compute, and it's just not the same.
The PC was never "no web". No one actually "counted every screw in their garage" as the PC killer app. It was always the web.
> I don't know, I think you're conflating content streaming with central compute.
Would you classify e.g. Gmail as "content streaming"?
But Gmail is also a relatively complicated app, much of which runs locally on the client device.
Well, app code is streamed, content is streamed. The app code is run locally. Content is pulled periodically.
The mail server is the mail server even for Outlook.
Outlook gives you a way to look through email offline. The Gmail apps, and even Gmail in Chrome, have an offline mode that lets you look through email.
It's hard to call that fully offline, but it's not a dumb terminal either.
> using the computer as a dumb terminal to access centralized services "in the cloud"
Our personal devices are far from thin clients.
Depends on the app, and the personal device. Mobile devices are increasingly thin clients. Of course, hardware-wise they are fully capable personal computers, but ridiculous software-imposed limitations make using them that way increasingly difficult.
"Thin" can be interpreted as relative, no?
I think it depends on whether you see the browser as a content viewer or as a runtime environment.
Maybe it depends on the application architecture...? I.e., a compute-heavy WASM SPA at one end vs. a server-rendered website at the other.
Or is it an objective measure?
But that is what they are mostly used for.
On phones, most of the compute is used to render media files and games, and make pretty animated UIs.
The text content of a weather app is trivial compared to the UI.
Same with many web pages.
Desktop apps use local compute, but that's more a limitation of latency and network bandwidth than any fundamental need to keep things local.
Security and privacy also matter to some people. But not to most.
Speak for yourself. Many people don't daily-drive anything more advanced than an iPad.
iPads are incredibly advanced. Though I guess you mean they don't use anything that requires more sophistication from the user (or something like that)?
The iPad is not a thin client, is it?
I mean, Chromebooks really aren't very far at all from thin clients. But even my monster ROG laptop, when it's not gaming, is mostly displaying the results of computation that happened elsewhere.
There are more PCs and serious home computing setups today than there were back then. There are just way way way more casual computer users.
The people who only use phones and tablets, or only use laptops as dumb terminals, are not the people who were buying PCs in the 1980s and 1990s - or if they were, they were not serious users. They were mostly non-computer-users.
Non-computer-users have become casual consumer-level computer users because the tech went mainstream, but there's still a massive serious-computer-user market. I know many people with home labs or even small cloud installations in their basements, and there are about as many of them as there were serious users with top-end PC setups in the late 1980s.
I dislike the view of individuals as passive sufferers of the preferences of big corporations.
You can and people do self-host stuff that big tech wants pushed into the cloud.
You can have a NAS or a private media player, and Home Assistant has been making waves in the home automation sphere. Turns out people don't like buying overpriced devices only to have to pay a $20 subscription, find out their devices don't talk to each other, upload footage from inside their homes to the cloud, and then get bricked once the company selling them goes under and turns off the servers.
This. And the hordes of people reacting with some explanation for why this is. The "why" is not the point; we already know the "why". The point is that you can if you want. Might not be easy, might not be convenient, but that's not the point. No one has to ask someone else for permission to use tech other than big tech.
The explanation of 'why' is not an argument. Big tech is not making it easy != it's impossible. Passive sufferers indeed.
Edit: got a website with an RSS feed somewhere maybe? I would like to follow more people with a point of view like yours.
You can dislike it, but that doesn't make it any less true - and it's only getting truer.
You can likewise host models yourself if you so choose. Still, the vast majority of people use online services both for personal computing and for LLMs.
Things are moving this way because it’s convenient and easy and most people today are time poor.
I think it has more to do with the 'common wisdom' dictating that this is the way to do it, as 'we've always done it like this'.
Which might even be true, since cloud-based software might offer conveniences that local substitutes don't.
However, this is not an inherent property of cloud software; it's just that some effort needs to go into a local alternative.
That's why I mentioned Home Assistant - a couple years ago, smart home stuff was all the rage, and not only was it expensive, the backend ran in the cloud, and you usually paid a subscription for it.
Nowadays, you can buy a local Home Assistant hub (or make one using a Pi) and have all your stuff only connect to a local server.
The same is true for routers, NAS, media sharing, streaming to the TV, etc. You do need to get a bit technical, but you don't need to do anything you couldn't figure out by following a 20-minute YouTube video.
I look forward to a possibility where the dumb terminal is less centralized in the cloud and more like how it seems to work in The Expanse. They all have hand terminals that seem to automatically interact with the systems and networks of the ship/station/building they're in, linking up with local resources, and likely having default permissions set to restrict weird behavior.
Not sure it could really work like that IRL, but I haven't put a ton of thought into it. It'd make our always-online devices make a little more sense.
But for a broader definition of "personal computer", the number of computers we have has only continued to skyrocket - phones, watches, cars, TVs, smart speakers, toaster ovens, kids' toys...
I'm with GP - I imagine a future when capable AI models become small and cheap enough to run locally in all kinds of contexts.
https://notes.npilk.com/ten-thousand-agents
Depending on how you define AI models, they already do. Think of the $15 security camera that can detect people and objects - that is driven by an AI model. LLMs are another story, but smaller, less capable ones can and do already run at the edge.
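For the LLM case, a rough sketch of what "runs at the edge" can look like today, assuming the llama-cpp-python bindings and a small quantized GGUF model you've already downloaded (the file path is a placeholder); everything runs on a handful of CPU cores with no network access.

```python
# Sketch of a small local model: a quantized GGUF file run entirely on-device
# via the llama-cpp-python bindings. The model path is a placeholder for
# whatever small model you have downloaded locally; nothing here touches the network.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/small-model-q4.gguf",  # placeholder path to a local quantized model
    n_ctx=2048,    # modest context window keeps memory use low
    n_threads=4,   # runs fine on a handful of CPU cores
)

out = llm(
    "Summarize in one sentence why offline-capable software matters:",
    max_tokens=64,
)
print(out["choices"][0]["text"])
```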
I think that speaks more to the fact that software ate the world than to the locality of compute. It's a breadth-first, depth-last game.
Makes me want to unplug and go back to offline social media. That's a joke. The dominant effect was networked applications getting developed, enabling community, not a shift back to client terminals.
Once upon a time, social media was called Usenet and worked offline in a dedicated client with a standard protocol. You only went online to download and send messages, but could then go offline and read them in an app of your choice.
Web 2.0 discarded the protocol approach and turned your computer into a thin client that does little more than render webapps that require you to be permanently online.
> Once upon a time, social media was called Usenet and worked offline in a dedicated client with a standard protocol.
There was also FidoNet with offline message readers.
> called Usenet and worked offline
People must have been pretty smart back then. They had to know to hang up the phone to check for new messages.
I guess we're in the KIM-1 era of local models - or is that already behind us?
That 'average' is doing a lot of work to obfuscate the landscape. Open source continues to grow (indicating a robust ecosystem of individuals who use their computers for local work), and more importantly, the 'average' looks the way it does not because of a reduction in local use, but because of an explosion of users who did not previously exist (mobile-first users, SaaS customers, etc.).
The thing we do need to be careful about is regulatory capture. We could very well end up with nothing but monolithic centralized systems simply because it's made illegal to distribute, use, and share open models. They hinted quite strongly that they wanted to do this with DeepSeek.
There may even be a case to be made that at some point in the future, small local models will outperform monoliths - if distributed training becomes cheap enough, or if we find an alternative to backprop that allows models to learn as they infer (like a more developed forward-forward, or something like it), we may see models that do better simply because they aren't a large centralized organism behind a walled garden. I'll grant that this is a fairly Pollyanna take and represents the best possible outcome, but it's not outlandishly fantastic - and there is good reason to believe that any system based on a robust decentralized architecture would be more resilient to problems like platform enshittification and overdeveloped censorship.
At the end of the day, it's not important what the 'average' user is doing, so long as there are enough non-average users pushing the ball forward on the important stuff.
We already have monolithic centralised systems.
Most open source development happens on GitHub.
You'd think non-average developers would have noticed their code is now hosted by Microsoft, not the FSF. But perhaps not.
The AI end game is likely some kind of post-Cambrian, post-capitalist soup of evolving distributed compute.
But at the moment there's no conceivable way for local and/or distributed systems to have better performance and more intelligence.
Local computing has latency, bandwidth, and speed/memory limits, and general distributed computing isn't even a thing.
I can't imagine a universe where a small mind with limited computing resources has an advantage against a datacenter mind, no matter the architecture.
The small mind could have an advantage if it is closer to users, or more trustworthy to them.
It only has to be good enough to do what we want. In the extreme, maybe inference becomes cheap enough that we ask “why do I have to wake up the laptop’s antenna?”
I don't want to send sensitive information to a data center, I don't want it to leave my machine/network/what have you. Local models can help in that department.
You could say the same about all self-hosted software, teams with billions of dollars to produce and host SaaS will always have an advantage over smaller, local operations.
Abundant resources could enable bad designs. I could in particular see a lot of commercial drive for huge models that can solve a bazillion different use cases, but aren't efficient for any of them.
There might also be local/global bias strategies. A tiny local model trained on your specific code/document base may be better aligned to your specific needs than a galaxy-scale model. If it only knows about one "User" class - the one in your codebase - it might be less prone to borrowing irrelevant ideas from fifty other systems.
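You can approximate that local bias without training anything at all: build a retrieval index over nothing but your own repository and feed only those files to whatever local model you use. A sketch of that (retrieval rather than training, so a cheaper substitute for what the comment describes), assuming the sentence-transformers library and a placeholder project path:

```python
# Local bias via retrieval: embed only the files in your own repository and
# search that index, so nothing outside your codebase can be surfaced.
# Paths and the embedding model name are placeholders; sentence-transformers
# runs fully offline once the weights are cached.
from pathlib import Path

import numpy as np
from sentence_transformers import SentenceTransformer

embedder = SentenceTransformer("all-MiniLM-L6-v2")  # small model, runs on CPU

# Index nothing but your own code: no fifty other "User" classes.
files = list(Path("./my_project").rglob("*.py"))
texts = [f.read_text(errors="ignore") for f in files]
vectors = embedder.encode(texts, normalize_embeddings=True)

def most_relevant(query: str, k: int = 3) -> list[Path]:
    """Return the k files from *this* codebase closest to the query."""
    q = embedder.encode([query], normalize_embeddings=True)[0]
    scores = vectors @ q  # cosine similarity, since embeddings are normalized
    return [files[i] for i in np.argsort(scores)[::-1][:k]]

print(most_relevant("where is the User class defined?"))
```

The design choice is the restriction itself: the index simply cannot surface anyone else's code, which is the "only knows about your User class" property stated in terms of retrieval instead of training.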
The advantage it might have won't be in the form of "more power", it would be in the form of "not burdened by sponsored content / training or censorship of any kind, and focused on the use-cases most relevant to the individual end user."
We're already very, very close to "smart enough for most stuff". We just need that to also be "tuned for our specific wants and needs".
The only difference is latency.
Universes like ours where the datacentre mind is completely untrustworthy.
Even the most popular games (with few exceptions) present as relatively dumb terminals that need constant connectivity to sync every activity to a mainframe - not necessarily because it's an MMO or multiplayer game, but because it's the industry standard way to ensure fairness. And by fairness, of course, I mean the optimization of enforcing "grindiness" as a mechanism to sell lootboxes and premium subscriptions.
And AI just further normalizes the need for connectivity; cloud models are likely to improve faster than local models, for both technical and business reasons. They've got the premium-subscriptions model down. I shudder to think what happens when OpenAI begins hiring/subsuming-the-knowledge-of "revenue optimization analysts" from the AAA gaming world as a way to boost revenue.
But hey, at least you still need humans, at some level, if your paperclip optimizer is told to find ways to get humans to spend money on "a sense of pride and accomplishment." [0]
We do not live in a utopia.
[0] https://www.guinnessworldrecords.com/world-records/503152-mo... - https://www.reddit.com/r/StarWarsBattlefront/comments/7cff0b...
I imagine there are plenty of indie single-player games that work just fine offline. You lose cloud saves and achievements, but everything else still works.