Comment by jsheard
2 months ago
> It seems we will run out of hardware by March?
What happens when an unstoppable force (building everything in Electron because hardware is cheap) meets an immovable object (oh no hardware is expensive now)?
We go back to the demoscene days, being creative with what we have instead of shipping Electron junk.
Inshallah
If the FSM wills
Maybe we need to let go of our auto-scaled 100 pod service mesh for a todo list app, and just deploy it bare metal on 2 servers.
You joke, but I remember seeing a talk by the Wunderlist CTO who did pretty much that. It was also a polyglot company, with microservices in random languages. I can't find the talk now, but https://www.infoq.com/news/2014/11/gotober-wunderlist-micros... mentions at least 60 services.
I need to get more ideas for my side project. A todo list app with micro services, but everything in bash scripts. So far it's just 6 services.
https://github.com/andi0b/vibe-todo
I guess we have to get creative again.
I actually think you're right here.
Resource constraints have often helped me come up with stuff that I'm actually proud of.
Absolutely. It's why I find working on a microcontroller with 1KB of memory so much more rewarding than, say, a Raspberry Pi.
Consumer RAM is not what's creating the shortage. Data centers don't run Electron to train models or serve inference.
Sure, consumer RAM isn't causing the shortage, but it's affected by it.
They effectively do. They're trained by brute-forcing 100TB of training data through them, rather than by any logical learning technique.
A human doesn’t need 100TB of books to learn the alphabet.
> A human doesn’t need 100TB of books to learn the alphabet.
A human does need 16ish hours per day of audio/video content for several years to learn the alphabet.
Every RAM producer is shifting its consumer-grade RAM production toward ECC RAM and VRAM now. Micron discontinued and closed down the Crucial brand as a whole.
So getting systems with higher RAM capacity is becoming harder, from laptops to smartphones. For the next couple of years, we need to stop using Electron so much and use what we have efficiently.
Data centers, especially AI hyperscalers, don't care about efficiency for now, because they can suffocate the consumer-grade part of the hardware market and get anything and everything they want. When their bubble pops, or the capacity runs out, they will need to learn to be efficient too.
For reference, a well-optimized cluster runs at ~90% utilization even with thousands of users. AI hyperscalers are not there; maybe 60% at most. They waste a lot of resources to keep their momentum.
I have a silent hope that because of this change we will all get ECC RAM, and that consumer CPUs will get proper support for it.
ECC RAM uses the same chips, just with an extra one on the DIMM.
The problem is that manufacturers are shifting their production capacity toward the more profitable, high-performance "AI" components.
Stop using Electron to save massive amounts of RAM.
It's not an easy choice if you want a mature cross-platform UI framework.
There's plenty of those.
Everything gets more expensive?
2026 will be the year of Rust...
Due to the lack of memory leaks, which will stop RAM prices from increasing?
Because it's more memory efficient than most other languages. So you can achieve the same result with lower RAM requirements.
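One concrete reason for that efficiency: a plain Rust value is just its fields, with no per-object header or GC metadata. A minimal sketch (a hypothetical `Point` struct, not anything from the thread):

```rust
use std::mem::size_of;

// Two f32 fields: exactly 8 bytes, no hidden object header.
// In Java or Python, each Point would also carry per-object metadata.
struct Point {
    x: f32,
    y: f32,
}

fn main() {
    assert_eq!(size_of::<Point>(), 8);

    // A million points live in one contiguous ~8 MB buffer,
    // not a million individually boxed heap objects.
    let pts: Vec<Point> = (0..1_000_000)
        .map(|i| Point { x: i as f32, y: 0.0 })
        .collect();
    println!("{}", pts.len() * size_of::<Point>()); // 8000000
}
```

The same layout guarantee is why a Rust service can often fit in a fraction of the RAM a managed-runtime equivalent needs.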
The efficiency...
https://users.rust-lang.org/t/energy-consumption-in-programm...
That, and also because the Rust compiler is a very good guardrail and feedback mechanism for AI. I made three little tools that I use myself, without knowing how to write a single line of Rust.