Comment by geerlingguy
2 days ago
These were nice early in the TensorFlow evolution, for things like Frigate...
But even CPU inference is both faster and more energy efficient with a modern Arm SBC chip, and things like the Hailo chip are way faster for a similar price, if you have an M.2 slot.
I haven't seen a good USB port alternative for edge devices though.
The big problem is Google seems to have let the whole thing stagnate since like 2019. They could have had some neat little 5/10/20 TOPS NPUs for cheap if they had continued developing this hardware ecosystem :(
>The big problem is Google seems to have let the whole thing stagnate since like 2019.
Google's flightiness strikes again. How they expect developers (and to some degree consumers) to invest in their churning product lines is beyond me. What's the point in buying a Google product when there's a good chance Google will drop software support and any further development in 5 years or less?
At my last day job, this issue (https://github.com/google-coral/libedgetpu/issues/26) was the last nail in the coffin that got us to move away from Coral hardware. And that was when we were willing to look past even the poor availability of the hardware during the peak of the chip shortage.
> As per https://www.tensorflow.org/guide/versions , Can we assume that libedgetpu released along with a tflite version is compatible with all the versions of tflite in the same major version?
> Hi, we can't give any guarantee that libedgetpu released along with a tflite version is compatible with all the versions of tflite in the same major version.
Yea, right! stares at my Nest smoke detectors
For a device with a 10-year life, and enough connectivity to be future-proof, Google has handled them poorly.
It was only a few weeks ago that they added support for them in the Home app.
I believe they have an 802.15.4 radio; maybe the chipset is too old, but it would have been great to get Matter over Thread support for them.
Newcomers like Aqara will take up that space instead.
Even OpenVINO on an Intel iGPU is just as fast, with (I've heard) more accurate detection, and can be done on under 5W with an i3 mobile CPU or similar.
Where I live, electricity costs 45 cents per kWh. What would be a good Arm SBC to run Frigate, assuming I have 4 cameras?
I’m not sure what’s optimal for cost/performance, but a Pi 5 8GB + Hailo-8 looks like it will be a good option.
* https://www.reddit.com/r/frigate_nvr/s/ncxP1YQDfB
* https://github.com/blakeblackshear/frigate/blob/e773d63c16d9...
I tested Frigate on this combination and I must recommend against it. The Raspberry Pi 5 lacks the hardware-accelerated video encoding that its predecessor, the Pi 4, had. Due to this limitation, a Pi 5 will struggle with more than a few cameras, even with the 26 TOPS Hailo.
I'd also keep an eye on the Rubik Pi 3, as it looks specifically designed to compete with the Pi 5 + AI kit and should provide a faster, cheaper, more efficient option. They're only just starting to ship and there's no Frigate support yet, so it's just something to consider if you're not in a hurry to build a system.
* https://www.rubikpi.ai/
* https://liliputing.com/rubik-pi-is-a-compact-dev-board-with-...
* https://www.cnx-software.com/2025/01/09/qualcomm-qcs6490-rub...
Frigate is surprisingly not that CPU intensive if you have a Coral.
I've got a repurposed HP G2 SFF desktop with an old i5-6500 CPU running Proxmox with a bunch of VMs and LXC containers, including Frigate.
I'm passing the Coral USB through to the Frigate container for object detection and passing the Intel iGPU through for video decoding.
With 10 cameras continuously recording, the Coral's inference CPU usage is about 12% and Frigate's CPU usage is about 5%, although go2rtc, the service Frigate uses to read the camera streams and restream them to Frigate, takes up about 15% of the CPU.
Overall my CPU usage fluctuates below 30% on that entire machine, which runs much more than just Frigate under Proxmox.
I did run the wattage calculation on that machine and it was something reasonable; I don't recall it right now.
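For anyone wanting to replicate something similar, here's a minimal sketch of the relevant Frigate config, assuming a USB Coral for detection, VAAPI decode on the Intel iGPU, and go2rtc restreaming. The camera name, RTSP URL, and credentials are placeholders, and preset names can vary between Frigate releases, so check the docs for your version.

```yaml
# Minimal Frigate config sketch (camera name and RTSP URL are placeholders)
detectors:
  coral:
    type: edgetpu        # USB Coral passed through to the container
    device: usb

go2rtc:
  streams:
    front_door:
      - rtsp://user:pass@192.168.1.10:554/stream1   # pull once from the camera

cameras:
  front_door:
    ffmpeg:
      hwaccel_args: preset-vaapi                    # Intel iGPU handles video decode
      inputs:
        - path: rtsp://127.0.0.1:8554/front_door    # read the go2rtc restream
          roles:
            - detect
            - record
    record:
      enabled: true
```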
Just get OAK cameras from Luxonis.
If you want home surveillance, you can just tie a bunch to Ethernet and they'll do on-device AI.
I'm running Frigate on an SFF computer I got off eBay for $100 with an i7-8700. It averages around 14 watts, using OpenVINO for object detection and the intel-qsv hardware acceleration preset.
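For reference, the OpenVINO route in Frigate is mostly a config change. A rough sketch is below; the detector name is arbitrary, and the bundled model paths and preset names have changed across Frigate releases, so treat it as a starting point rather than a drop-in config.

```yaml
# Rough sketch of OpenVINO detection + Quick Sync decode in Frigate
detectors:
  ov:
    type: openvino
    device: GPU                               # run inference on the Intel iGPU

model:
  width: 300
  height: 300
  input_tensor: nhwc
  input_pixel_format: bgr
  path: /openvino-model/ssdlite_mobilenet_v2.xml      # model shipped in the Frigate image
  labelmap_path: /openvino-model/coco_91cl_bkgr.txt

ffmpeg:
  hwaccel_args: preset-intel-qsv-h264         # QSV hardware decode for H.264 streams
```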
How many cameras are you running with that? What's your inference speed in Frigate metrics?
What are you trying to do, if you don't mind me asking?
I'm looking to upgrade my home surveillance setup, currently running Arlo Pro 2 cameras. They work fine, but I'd prefer higher resolution and to avoid saturating my internet upstream with frequent video uploads.
My needs are pretty much the same as people who buy camera bundles from big box stores. I want reliable motion detection for intruders, deliveries, and visitors, and the ability to watch videos recorded in the past couple of weeks.
Looks like the Hailo M.2 accelerator costs $190, whereas I bought a Coral accelerator for $55, so they're not exactly comparable.
The Hailo-8L HAT for the Pi is only ~$80 and has more than 3x the compute power of the Coral USB.
* https://www.pishop.us/product/raspberry-pi-ai-hat-13-tops/
Even the bigger Hailo-8 HAT, with >6x the compute of the Coral, is only $135.
* https://www.pishop.us/product/raspberry-pi-ai-hat-26-tops/
Oh nice, thanks for digging that up. I currently run Frigate (along with Home Assistant) on my HP ProDesk 400 with an 8th-gen Intel i5 and the Coral USB. I wonder if it would run better with that Hailo-8 on my Pi 5 with 4GB.
> I haven't seen a good USB port alternative for edge devices though.
Thunderbolt 4
I assume they meant "an accelerator device that plugs into a USB port"
TB4 is PCIe over a cable. USB is out of the picture after device initialization.