Comment by lwhi

1 day ago

It's because of the Mac Mini's unified memory architecture, which is ideal for inference.

The amount of RAM available on a Mac Mini isn't enough for a decent open model for OpenClaw; everybody is using remote AI services on those.

  • You can get up to 64GB of memory.

    It's very difficult to get this much memory on a graphics card.
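As a rough sanity check on those numbers, here is a minimal sketch (the model size and quantization level are illustrative assumptions, not from the comment) of how much memory the weights of a quantized open model need:

```python
# Rough memory-footprint estimate for a quantized LLM's weights.
# These are approximations; real usage adds KV-cache and runtime overhead.

def model_memory_gb(params_billion: float, bits_per_weight: int) -> float:
    """Approximate weight memory in GB (1 GB = 2**30 bytes)."""
    bytes_total = params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / 2**30

# A hypothetical 70B-parameter model at 4-bit quantization:
print(round(model_memory_gb(70, 4), 1))  # ~32.6 GB for weights alone
```

Weights alone for a 70B model at 4-bit already exceed the 24GB found on typical consumer GPUs, but fit comfortably in a 64GB unified-memory Mac Mini.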