Comment by Workaccount2
1 day ago
Anyone have any idea on the viability of running this on a Pi5 16GB? I have a few fun ideas if this can handle working with images (or even video?) well.
The 4-bit quant weighs 4.25 GB, and on top of that you need memory for the rest of the inference process (KV cache, activations, runtime overhead). So yes, you can definitely run the model on a Pi 5 with 16 GB; you may just have to wait a while for results.
https://huggingface.co/unsloth/gemma-3n-E4B-it-GGUF
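To sanity-check whether it fits, here's a rough back-of-envelope: the 4.25 GB quant plus a KV cache whose size grows with context length, plus some slack for activations. The transformer shape below (layers, heads, head dim) is a generic illustrative assumption, not Gemma 3n's actual configuration:

```python
# Back-of-envelope RAM estimate for running a 4-bit GGUF quant on a Pi 5 (16 GB).
# The model shape used here is an illustrative assumption, not measured values.

def kv_cache_bytes(n_layers, n_kv_heads, head_dim, n_ctx, bytes_per_elem=2):
    # K and V caches: 2 tensors per layer, each of shape n_ctx x n_kv_heads x head_dim,
    # stored at fp16 (2 bytes per element) by default.
    return 2 * n_layers * n_kv_heads * head_dim * n_ctx * bytes_per_elem

weights_gb = 4.25  # the Q4 quant linked above
# hypothetical transformer shape, for illustration only
kv_gb = kv_cache_bytes(n_layers=32, n_kv_heads=8, head_dim=128, n_ctx=4096) / 2**30
total_gb = weights_gb + kv_gb + 1.0  # ~1 GB slack for activations and runtime overhead

print(f"KV cache: {kv_gb:.2f} GB, estimated total: {total_gb:.2f} GB")
```

Even with generous slack that lands well under 16 GB, so memory isn't the bottleneck; token throughput on the Pi's CPU is.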
See here. Long story short, this is another in a series of blog posts that would lead you to believe this is viable, but it isn't :/ https://news.ycombinator.com/item?id=44389793