Comment by nekusar
21 hours ago
Local LLMs.
Krasis is one such tool that allows running large models on blended GPU/RAM.
ik_llama for better performance than stock llama.cpp.
ComfyUI for local image generation.
Nanocrab seems better for orchestration. Still need a good system capability firewall.
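For anyone curious what "blended GPU/RAM" looks like in practice, here's a minimal sketch assuming a llama.cpp-style CLI (ik_llama is a fork of llama.cpp, so the flags are similar; the model filename is made up):

```shell
# Offload 20 transformer layers to the GPU with -ngl (--n-gpu-layers);
# the remaining layers stay in system RAM and run on CPU.
# That's how a model bigger than your VRAM still runs at usable speed.
./llama-cli -m ./models/my-70b-Q4_K_M.gguf -ngl 20 -c 8192 -p "Hello"
```

Tune -ngl until VRAM is nearly full; every layer you can push to the GPU helps.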
Who’s buying the memory for this effort?
Think how cheap it's gonna be when everyone abandons the cloud providers and they start selling the $50B of hardware they over-invested in.
I got 96GB of DDR5 RAM 2y ago for $300.
Now 32GB goes for $300. Fucking insane. But prices will eventually come down once the enterprise and corpo scalpers realize AI is a losing deal for human replacement. Nvidia has already said as much.
https://fortune.com/2026/04/28/nvidia-executive-cost-of-ai-i...
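For what it's worth, the per-GB math on those two prices works out to roughly a 3x jump:

```python
# DDR5 price-per-GB comparison, using the numbers quoted above.
then_per_gb = 300 / 96   # ~$3.1/GB two years ago
now_per_gb = 300 / 32    # ~$9.4/GB today

print(f"then: ${then_per_gb:.2f}/GB, now: ${now_per_gb:.2f}/GB, "
      f"{now_per_gb / then_per_gb:.0f}x increase")
```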