Comment by eitally
3 days ago
Fwiw, I have a master's in operations research as a focus area within an industrial engineering degree, and spent 15 years working in manufacturing systems with a focus on test automation & quality. Traditional SPC/SQC analysis is, and will remain, king -- at least for some time. That may evolve in high-volume/low-mix scenarios that lend themselves more easily to training models for anomaly detection, but for complex product manufacturing in high-mix factories that's not the case. It's far better to let your test/quality engineers do their jobs and work out statistical controls on their own.
Among other reasons, this is largely true because acceptable ranges for different anomaly & defect types can vary significantly across revs of a single product, or even sub-revs (changes tied to an ECO that don't increment the product rev), or -- more crucially -- across the lines the product is manufactured on. One thing that's notoriously tricky to troubleshoot without being physically onsite is whether a defect is caused by a machine, by a person, or by faulty piece parts/material.
Understanding and knowing how to apply traditional statistical analysis to these problems -- and also designing useful data structures to store all the data you're collecting -- is far more valuable right now than trying to shoehorn in an AI model to do this work.
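To make the "traditional statistical controls" point concrete, here is a minimal sketch of a classic Shewhart X-bar chart -- the bread-and-butter SPC tool the comment alludes to. The data, subgroup size, and function names are all hypothetical; the d2 constant is the standard SPC table value for subgroups of 5.

```python
# Minimal Shewhart X-bar control chart sketch (hypothetical example).
# Limits are center ± 3*sigma_hat/sqrt(n), with sigma_hat estimated
# from the average subgroup range via the d2 constant -- classic SPC.
import statistics

D2_N5 = 2.326  # standard SPC d2 constant for subgroup size n = 5

def xbar_limits(subgroups):
    """Return (LCL, center line, UCL) from rational subgroups of size 5."""
    xbars = [statistics.mean(g) for g in subgroups]
    rbar = statistics.mean(max(g) - min(g) for g in subgroups)
    center = statistics.mean(xbars)
    sigma_hat = rbar / D2_N5            # estimate of process sigma
    n = len(subgroups[0])
    margin = 3 * sigma_hat / n ** 0.5   # 3-sigma limits on the subgroup mean
    return center - margin, center, center + margin

def out_of_control(subgroups):
    """Indices of subgroups whose mean falls outside the control limits."""
    lcl, _, ucl = xbar_limits(subgroups)
    return [i for i, g in enumerate(subgroups)
            if not (lcl <= statistics.mean(g) <= ucl)]
```

For example, twenty stable subgroups around 10.0 plus one shifted to 11.0 would flag only the shifted subgroup. Note the comment's caveat applies directly: the limits computed here are only valid per product rev and per line, which is exactly why engineers, not a generic model, need to own them.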