Comment by themafia
1 day ago
> Removing battery swaps is the last step to deploy UAVs autonomously at scale.
I can't say, as a citizen, that I'm particularly excited about this.
> Autonomous drones can deliver over 20x the inspection coverage for the same cost.
And we have 20x the manpower to review this footage? I wonder if you're just generating a bunch of data that cannot be practically used.
"More coverage" isn't always the best answer. "Better informed coverage" is probably the problem to solve here. Aside from that what is the maintenance interval on those drones? How does that incorporate into this system?
I think this is solving the problem in the wrong direction.
We do semi-automated analysis of imagery of things like utility rights-of-way, and it scales well. We triage the vast majority of images automatically and surface only a small subset to human experts for review; it's much, much faster than having the experts in the field, while still giving high coverage. (Most images are really boring.)
In most inspection cases you need to look at the stuff anyway, so any cost reduction is very welcome. And if you increase the total coverage while reducing human time, you get a double benefit: more granular information, in time or in space, which is often useful.
(And as a commenter below notes, this all works pretty well with several-year-old CNNs. We use a limited amount of image-LLM stuff to surface things zero-shot, but most of what we end up doing is a very conventional classifier, plus a lot of engineering work so the experts see only the important things, fast.)
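To make the triage step concrete, here is roughly its shape; the model, score function, and threshold below are placeholders, not our production setup:

```python
# Rough shape of the triage step: score every image with a cheap classifier,
# auto-clear the confidently boring ones, and queue the rest for an expert,
# most suspicious first. Names and threshold are placeholders.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Triaged:
    image_path: str
    anomaly_score: float   # classifier's estimate that anything interesting is present

def triage(image_paths: List[str],
           score_fn: Callable[[str], float],
           auto_clear_below: float = 0.05) -> List[Triaged]:
    """Return only the images an expert should look at, sorted by severity."""
    scored = [Triaged(p, score_fn(p)) for p in image_paths]
    flagged = [t for t in scored if t.anomaly_score >= auto_clear_below]
    # Experts see the most suspicious images first; everything else is archived.
    return sorted(flagged, key=lambda t: t.anomaly_score, reverse=True)
```

Most of the real engineering effort goes into the scoring model and the review UI, not this loop.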
> And we have 20x the manpower to review this footage?
I was involved with a startup that did inspections of power-generation windmills. The computer-vision anomaly detection was really good, and that was about five years ago. The goal was to have the automated visual inspections route images with suspected anomalies to humans for review, and it was working well the last time I heard.
Compared to having a human who needs to rappel down to the blades for a manual inspection, this is a huge productivity and safety boost.
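For flavor, the classic unsupervised version of that kind of detector is surprisingly small: model what a healthy blade looks like and flag anything that doesn't reconstruct well. A toy PCA sketch (the real system was surely more sophisticated; this only shows the shape of the idea):

```python
import numpy as np

def fit_normal_model(normal_patches: np.ndarray, n_components: int = 16):
    """Fit a PCA model of defect-free blade patches.

    normal_patches: (N, D) array of flattened image patches known to be good.
    Returns the mean patch and the top principal components of "normal" appearance.
    """
    mean = normal_patches.mean(axis=0)
    centered = normal_patches - mean
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean, vt[:n_components]

def anomaly_score(patch: np.ndarray, mean: np.ndarray, components: np.ndarray) -> float:
    """Reconstruction error: high when the patch doesn't look like a normal blade."""
    centered = patch - mean
    reconstruction = components.T @ (components @ centered)
    return float(np.linalg.norm(centered - reconstruction))
```

Images whose patches score above a tuned threshold are the ones routed to a human.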
Why would humans be reviewing footage?
I've worked on similar systems for oil & gas that combined hyperspectral imaging and LIDAR. The analysis of data collected by drones was fully automated. It was at least as effective as humans at detecting anomalies (something which was thoroughly verified prior to adoption).
The more thorough coverage, much earlier detection of potential issues, and increased automation greatly reduced the total manpower required. Humans only came into the picture when the drones found a problem that needed mitigation. Humans have long been the bottleneck for finding operational risks and issues before they turn into a headline; the more humans you can remove from that loop, the bigger the win.
This was years ago, and the tech has only improved.
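For anyone curious how "fully automated" can work on hyperspectral data: one textbook approach is the Reed-Xiaoli (RX) detector, which is just a per-pixel Mahalanobis distance from the scene background, with humans pulled in only when something clears an alarm threshold. A minimal numpy sketch (the threshold and escalation hook are illustrative, not what we actually deployed):

```python
import numpy as np

def rx_scores(cube: np.ndarray) -> np.ndarray:
    """Reed-Xiaoli (RX) anomaly detector.

    cube: hyperspectral image of shape (H, W, bands).
    Returns an (H, W) map of Mahalanobis distances from the scene background.
    """
    h, w, bands = cube.shape
    pixels = cube.reshape(-1, bands).astype(np.float64)
    mean = pixels.mean(axis=0)
    cov = np.cov(pixels, rowvar=False)
    cov_inv = np.linalg.pinv(cov)          # pseudo-inverse in case bands are highly correlated
    centered = pixels - mean
    scores = np.einsum("ij,jk,ik->i", centered, cov_inv, centered)
    return scores.reshape(h, w)

def needs_human(cube: np.ndarray, alarm: float = 50.0) -> bool:
    """Humans only enter the loop when something trips the alarm threshold."""
    return bool((rx_scores(cube) > alarm).any())
```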
We have done automated utility inspections for the past few years. Computer vision is not a problem anymore. When we were starting, we had to annotate thousands of images; today some use cases require fewer than 100. The more interesting problems are pre-field planning, sagging, and BLOS compliance.
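A big part of why so few annotations are needed these days is transfer learning: you start from a pretrained backbone and only train a small head, so fewer than a hundred labeled images can be enough. A minimal sketch (model choice, classes, and hyperparameters are illustrative, not our actual pipeline):

```python
import torch
import torch.nn as nn
from torchvision import models

# Start from an ImageNet-pretrained backbone and train only a new head,
# which is why a few dozen labeled utility images can go a long way.
weights = models.ResNet18_Weights.DEFAULT
model = models.resnet18(weights=weights)
for param in model.parameters():
    param.requires_grad = False                          # freeze the backbone

num_classes = 3                                          # e.g. ok / damaged / vegetation (illustrative)
model.fc = nn.Linear(model.fc.in_features, num_classes)  # new trainable head

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

def train_step(images: torch.Tensor, labels: torch.Tensor) -> float:
    """One supervised step over a (small) labeled batch."""
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```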
Hey, Avi here, cofounder of Voltair.
We should chat. I'd love to hear more about your computer vision model; my email is avi@voltairlabs.com if you want to shoot me an email.
Gosh, I hope my house isn't part of the inspection coverage.