
Comment by jasongill

2 days ago

Yeah, I don't see anything new here, I am guessing that OP just chanced across it.

This is not very useful or relevant these days - it's basically abandoned at this point and only works with older versions of Python, etc.

Searching around, it appears that the Coral USB Accelerator does about 4 TOPS.

The Raspberry Pi 5 AI Kit, which costs a few bucks more (and came out last year instead of in 2020), does 13 TOPS.

The Jetson Orin Nano Super, which costs $250, does 67 TOPS, and was just updated last month (although it's a refresh of the original product).

I own all three of these products, and they are all very frustrating to work with, so you need a very specific use case to make them worthwhile - if you don't, just stick with your machine's GPU.

> The Jetson Orin Nano Super, which costs $250, does 67 TOPS, and was just updated last month (although it's a refresh of the original product).

FWIW, there wasn't actually a physical revision/refresh; it's all software, so owners of the original can just update and get the boost as well.[0]

> With the same hardware architecture, this performance boost is enabled by a new power mode which increases the GPU, memory, and CPU clocks. All previous Jetson Orin Nano Developer Kits can use the new power mode by upgrading to the latest version of JetPack.

[0]: https://developer.nvidia.com/blog/nvidia-jetson-orin-nano-de...

This requires a HAT.

See my other comment regarding efficiency of my Intel Xe iGPU.

Jetson is in a different league, though. These can run even LLMs (though the 16 GB version was overpriced when I bought during COVID, so I went for the 8 GB). Ollama Just Works (tm); getting Ollama working with ROCm on my 6700 XT, by contrast, was frustrating.

Object detection with TensorFlow works well with these Coral TPUs. However, you can forget about even running whisper.cpp.

One nice thing the Coral USB Accelerator does have going for it, though, is that it's USB: you can get it working on practically any machine. Great for demos.

For an old version of Python, fire up a VM or an OCI container, or use a decent package manager like uv or pipx.
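For what it's worth, pinning an old interpreter with uv is nearly a one-liner. A minimal sketch, assuming uv is installed; the Python 3.9 ceiling and the Coral package index URL are as I recall them from Coral's install docs, so verify before relying on them:

```shell
# Create an isolated venv on an older interpreter just for the Coral tooling
# (pycoral's last release targets Python 3.9 or older).
uv venv --python 3.9 coral-env
source coral-env/bin/activate

# Install pycoral from Google's Coral package index.
uv pip install --extra-index-url https://google-coral.github.io/py-repo/ pycoral
```

This keeps the abandoned toolchain fully sandboxed, so the rest of your system can stay on a current Python.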

  • Using Ollama/llama.cpp with Vulkan is much easier, and works across more GPUs, than ROCm. I wish they'd merge the PR that adds it across the board :(

Coral's claim to fame was 2 TOPS/W. But, to everyone's point here, there's literally no news here, since its core (Google's Edge TPU) wasn't updated. I wouldn't be surprised if newer products outperform it at this point.
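To put napkin math on that efficiency point, here's a quick comparison built from the TOPS figures in this thread. The power draws are my rough assumptions, not measured values: ~2 W for the Coral (implied by its 2 TOPS/W claim), ~2.5 W for the Pi AI Kit's Hailo-8L, and ~25 W for the Orin Nano Super in its maximum power mode.

```python
# Back-of-envelope perf-per-watt from the TOPS figures in this thread.
# Power draws are rough assumptions, not measurements.
accelerators = {
    "Coral USB Accelerator": (4, 2.0),     # 4 TOPS, ~2 W
    "Pi AI Kit (Hailo-8L)": (13, 2.5),     # 13 TOPS, ~2.5 W (assumed)
    "Jetson Orin Nano Super": (67, 25.0),  # 67 TOPS, ~25 W max mode (assumed)
}

for name, (tops, watts) in accelerators.items():
    print(f"{name}: {tops / watts:.2f} TOPS/W")
```

Under these assumptions, the Coral is no longer the efficiency leader either: the Hailo-8L comes out well ahead on TOPS/W, and even the much bigger Jetson roughly matches the Coral's ratio.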