Comment by y42

6 days ago

> consumer-grade hardware

Not disagreeing per se, but a quick look at the installation instructions confirms what I assumed:

Yeah, you can run a highly quantized version on your 2020 Nvidia GPU. But:

- During inference, it occupies your whole machine. At least you get a modern interactive heating feature in your flat.

- You need to follow eleven thousand nerdy steps to get it running; my mum is really looking forward to that.

- Not to mention the pain of installing Nvidia drivers in the first place; nothing my mum will manage any time soon.

... and all this to get something that merely competes with Haiku.

Don't get me wrong - I know I'm exaggerating. It's important to have competition and the option to run "AI" on your own metal. But this reminds me of the early days of smartphones and my old XDA Neo. Sure, it was damn smart, and I remember all those jealous faces because of my "device from the future." But oh boy, it was also a PITA to maintain.

Here we are now. Running AI locally is a sneak peek into the future. But as long as you need a CS degree and hardware worth a small car to get reasonable results, it's far from mainstream. So "consumer-grade hardware" sounds like a euphemism here.

I like how we nerds live in our bubble celebrating this stuff while 99% of mankind still doomscrolls through Facebook, laughing at (now AI-generated) brain rot.

(No offense (ʘ‿ʘ)╯)