Comment by dawnerd

6 months ago

Can’t wait for cheap vibe coded electronics to flood Amazon and burn houses down.

Hah, don't need LLMs for that.

Amazon has been hiding behind "it's a marketplace" for more than a decade. There's an insane amount of shit that should never be sold, including, but not limited to, fake fire alarms sold as real ones. The CPSC tried going after Amazon but is stuck only going after listings once in a while. I can't imagine the deaths caused by Amazon are only in the single digits.

The traces for the power lines are extremely thin, and this device may run into problems because of it. These devices pull a lot of current when Wifi is on, and undersized traces don't help: depending on the load, a too-thin trace can act like a fuse and burn open. I can imagine plenty of other issues on a "vibe coded" PCB that could lead to fires.
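As a rough sanity check on that point, the common rule of thumb for trace current capacity is the IPC-2221 formula, I = k · ΔT^0.44 · A^0.725 (A in square mils, k ≈ 0.048 for external layers). A quick sketch — the trace widths below are illustrative guesses, not measurements from the actual board:

```python
# IPC-2221 rule-of-thumb limit for current through an external PCB trace:
#   I = k * dT**0.44 * A**0.725
# where A is the copper cross-section in square mils and
# k = 0.048 for external layers (0.024 for internal ones).

OZ_TO_MIL = 1.378  # thickness of 1 oz/ft^2 copper, in mils

def max_current_a(width_mil: float, copper_oz: float = 1.0,
                  temp_rise_c: float = 10.0, external: bool = True) -> float:
    """Approximate current (A) a trace can carry for a given temperature rise."""
    area_sq_mil = width_mil * copper_oz * OZ_TO_MIL
    k = 0.048 if external else 0.024
    return k * temp_rise_c ** 0.44 * area_sq_mil ** 0.725

# An ESP32-class module can pull roughly half an amp in bursts during
# Wi-Fi TX (ballpark figure). A skinny 6 mil trace in 1 oz copper is
# already marginal for that; 20 mil has comfortable headroom.
print(f"6 mil trace:  {max_current_a(6):.2f} A")
print(f"20 mil trace: {max_current_a(20):.2f} A")
```

These are first-order estimates for a 10 °C rise in free air; real boards (inner layers, nearby heat sources, solder mask) can do worse.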

  • There is no shortage (pun intended) of dangerously bad electronics out there already. If that's what you want to prevent, finger-wagging at AI coding for electronics isn't going to help - but regulations and certification requirements might.

    • Most mass produced electronics aren't vibe-coded hallucinations. There is at least some level of attention to detail in most of it, but of course there's still plenty of dangerous crap out there.

      I don't care about this one example project, but when thousands of people read about it and vibe-code their own hallucinated PCB, hopefully wasting their money is the worst thing that happens. They certainly won't learn much if the AI does it all for them, and they don't get the pride that comes from understanding: when someone asks if they made the thing, they'll feel like an imposter. Nice job, noob!

      I'm active in the world of amateur LED installations, and practically nobody realizes how easy it is to start a fire with a 500 watt power supply (or several of them connected together in bad ways) for their holiday lightshow. "AI" is not likely to help that and will probably make it worse.
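      To put numbers on that: at the 5 V common in addressable LED strips, 500 W means 100 A total, and Ohm's-law heating in undersized feed wire adds up fast. A back-of-envelope sketch (the per-meter resistances are typical published values for copper wire; the load scenario is hypothetical):

```python
# Why a 500 W, 5 V LED supply is scarier than it sounds:
#   I = P / V, and heat dissipated in a feed wire is P = I**2 * R.

R_PER_M = {          # approximate resistance of copper wire, ohms per meter
    "18 AWG": 0.021,
    "12 AWG": 0.0052,
}

def feed_current_a(watts: float, volts: float) -> float:
    """Total current drawn from the supply at full load."""
    return watts / volts

def wire_heat_w(current_a: float, gauge: str, length_m: float) -> float:
    """Power dissipated as heat in a single conductor of the given gauge."""
    return current_a ** 2 * R_PER_M[gauge] * length_m

print(f"full load: {feed_current_a(500, 5):.0f} A")
# Even a fraction of full load through thin hookup wire gets hot:
print(f"20 A, 2 m of 18 AWG: {wire_heat_w(20, '18 AWG', 2):.1f} W of heat")
print(f"20 A, 2 m of 12 AWG: {wire_heat_w(20, '12 AWG', 2):.1f} W of heat")
```

      That ~17 W cooking a thin wire, often zip-tied into a bundle, is how these installs start fires - which is why serious builds use heavy-gauge feeds, power injection, and fused taps.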

      "AI" is like the blind leading the blind, and it gives people permission to do the stupidest things. Sometimes it's right, but it's a gamble. It's not going to always give the same answer for the same question, and when it "hallucinates", a noob is unlikely to notice.