Comment by giantrobot
1 day ago
Apple's Neural Engine is used heavily by the non-LLM ML tasks scattered all over the system, like facial recognition in Photos. The point of it isn't to be some beefy AI co-processor but to be a low-power accelerator for background ML workloads.
The same workloads could run on the GPU, but the GPU is more general purpose and so burns more power for the same task. It's the same reason macOS uses hardware acceleration for video codecs and even JPEG: the work could be done on the CPU, but it would cost more power. Using dedicated hardware is part of how you get 10+ hours of battery life.
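To make that concrete, here's roughly how an app steers a Core ML model onto the Neural Engine instead of the GPU. This is just a sketch: the FaceEmbedder model class is invented, but MLModelConfiguration.computeUnits is the actual setting apps use to prefer the low-power path.

    import CoreML

    let config = MLModelConfiguration()
    // Keep inference on the CPU and Neural Engine and off the GPU -
    // the low-power background path described above.
    config.computeUnits = .cpuAndNeuralEngine

    // Hypothetical generated model class, shown only to illustrate
    // where the configuration gets applied.
    // let embedder = try FaceEmbedder(configuration: config)
    // let features = try embedder.prediction(image: pixelBuffer)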
Yes, of course, but it's basically a waste of silicon (which is very valuable) imo - you save a handful of watts on very few tasks. I would be surprised if, over the life of my MacBook, the NPU has been utilised more than 1% of the time the system is in use.
You still need a GPU regardless of whether you can do JPEG and h264 decode on it - for games, animations, etc.
Do you use Apple's Photos app? Ever see those generated "memories," or search for photos by facial recognition? Where do you think that processing is being done?
Your MacBook's NPU is probably active every moment your computer is on, and you just didn't know about it.
How often is the device actually generating memories, or how often am I searching for photos? I don't use Apple Photos fwiw, but even if I did I doubt I'd be in that app for 1% of my total computer time, and only a fraction of that time would be spent doing work on the ANE. I don't think searching for photos even requires it btw; if the photos are already indexed it's just a vector search.
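To spell out the "just a vector search" part: once embeddings have been computed at indexing time (on the ANE or anywhere else), a search query is ordinary arithmetic that runs fine on the CPU. A rough Swift sketch, with made-up names and no claim about how Photos actually implements it:

    // Cosine similarity between two embedding vectors.
    func cosineSimilarity(_ a: [Float], _ b: [Float]) -> Float {
        let dot = zip(a, b).map { $0 * $1 }.reduce(0, +)
        let normA = a.map { $0 * $0 }.reduce(0, +).squareRoot()
        let normB = b.map { $0 * $0 }.reduce(0, +).squareRoot()
        return dot / (normA * normB + 1e-9)
    }

    // Rank already-indexed photos against a precomputed query embedding.
    func search(query: [Float], index: [(id: String, embedding: [Float])]) -> [String] {
        index
            .map { (id: $0.id, score: cosineSimilarity(query, $0.embedding)) }
            .sorted { $0.score > $1.score }
            .map { $0.id }
    }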
You can use asitop to see how often it's actually being used.
I'm not saying it's never used, I'm saying it's used so infrequently that the (tiny) efficiency gains aren't worth the trade-off versus just running those workloads on the GPU.