Comment by rstuart4133

12 hours ago

> So, I'm only slightly trying to be a smartass here, but... Who is this for?

The primary difference between a Chromebook and a Googlebook appears to be the ability to run LLMs locally.

The requirements were spelt out at Google I/O. They boil down to a 40 TOPS NPU and a minimum of 16GB of memory. They appear to be trying to match Apple's M-series memory bandwidth using software compression. ChromeOS didn't need an NPU and specified a minimum of 4GB of memory. Aluminium OS looks to have the same relationship with its LLM as ChromeOS did with Google Chrome, and needs the hardware to power it.
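For a rough sense of why 16GB and bandwidth matter together, here's a back-of-envelope sketch. All of the numbers in it (model size, quantization, bandwidth, compression ratio) are my own assumptions, not anything Google has published: LLM decode is memory-bound, so tokens/sec is roughly effective bandwidth divided by bytes read per token, which is why software compression would buy real speed.

```python
# Back-of-envelope: can a local LLM fit and run on the rumored minimum spec?
# Every number below is an illustrative assumption, not a published requirement.

def model_footprint_gb(params_b: float, bits_per_weight: float) -> float:
    """Approximate weight memory in GB for a model quantized to N bits."""
    return params_b * 1e9 * bits_per_weight / 8 / 1e9

def decode_tokens_per_sec(weights_gb: float, bandwidth_gb_s: float) -> float:
    """Memory-bound decode estimate: each generated token reads all weights once."""
    return bandwidth_gb_s / weights_gb

params_b = 27               # assume a Gemma-class 27B model
bits = 4                    # assume 4-bit quantization
ram_gb = 16                 # rumored minimum RAM
raw_bw = 68                 # guess: LPDDR5X-class bandwidth, GB/s
compressed_bw = 2 * raw_bw  # guess: 2x effective bandwidth from compression

w = model_footprint_gb(params_b, bits)   # ~13.5 GB: barely fits in 16 GB
print(f"weights: {w:.1f} GB (fits in {ram_gb} GB? {w < ram_gb})")
print(f"decode @ raw bw:        {decode_tokens_per_sec(w, raw_bw):.1f} tok/s")
print(f"decode @ compressed bw: {decode_tokens_per_sec(w, compressed_bw):.1f} tok/s")
```

If those guesses are anywhere near right, the NPU's TOPS mostly gate prompt processing; decode speed is gated by bytes per token, which would explain bothering with compression at all.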

If they pull it off, you'll get GPT-4-class performance, running locally.

As for who this is for: your guess is as good as mine. But if their replacement for crostini works (crostini is so hopelessly unreliable it felt like it never got out of beta, so that's a big if), even the minimum spec would make for a very good Linux laptop.

Do you have a source for this local stuff?

I can kind of see it. They spent a lot of time getting Gemma 4 pretty efficient, then watched everyone buy Macs to run it, and realized it's maybe a real moat, since Apple doesn't make any AI of its own.

It would be an interesting product if it could actually give you GPT-class performance locally, and an awful experience if it's essentially just cloud AI. A premium laptop where most of the features are locked behind a subscription would be wild.