← Back to context

Comment by lwhi 1 day ago
It's because of the Mac Mini's unified memory architecture, which is ideal for inference.

ErneX 9 hours ago
The amount of RAM available on a Mac Mini is not good enough for a decent open model for OpenClaw; everybody is using remote AI services on those.

lwhi 6 hours ago
You can get up to 64GB of memory. It's very difficult to get this much memory on a graphics card.
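For context on why 64GB matters, here is a rough back-of-envelope sketch (illustrative figures, not from the thread): the memory needed just to hold a model's weights scales with parameter count and quantization level, before any KV-cache or OS overhead.

```python
def weight_memory_gb(params_billion: float, bits_per_param: float) -> float:
    """Approximate memory for model weights only, in GB (decimal).

    Ignores KV cache, activations, and OS overhead, so treat the
    result as a lower bound.
    """
    return params_billion * 1e9 * bits_per_param / 8 / 1e9

# A hypothetical 70B-parameter model at 4-bit quantization:
print(weight_memory_gb(70, 4))   # → 35.0 GB: plausible on 64GB unified memory
# The same model at full 16-bit precision:
print(weight_memory_gb(70, 16))  # → 140.0 GB: far beyond a 64GB machine
```

By this estimate, a heavily quantized large model can fit in 64GB of unified memory, while consumer GPUs topping out at 24GB of VRAM cannot hold it at all, which is the trade-off both commenters are circling.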