Statistical Process Control Challenges
In the world of Statistical Process Control there are two things you know for certain. First, every manufacturing process that involves machinery is constantly degrading, so the product it produces loses a little quality every day. Second, the charting tests that give early warning of unacceptable variation (the WECo and Nelson rules, for example) assume that the underlying population follows a normal distribution, and we know for certain that no distribution is perfectly normal.
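As a minimal sketch of what those charting tests actually do, here are two of the Western Electric (WECo) rules implemented in Python. The data, center line, and sigma are hypothetical; a real chart would estimate them from in-control production samples.

```python
def weco_violations(points, center, sigma):
    """Return indices that violate WECo Rule 1 (a point beyond 3 sigma)
    or Rule 2 (two of three successive points beyond 2 sigma, same side)."""
    flagged = set()
    for i, x in enumerate(points):
        # Rule 1: any single point outside the 3-sigma control limits.
        if abs(x - center) > 3 * sigma:
            flagged.add(i)
        # Rule 2: two out of three successive points beyond 2 sigma on one side.
        if i >= 2:
            window = points[i - 2:i + 1]
            for side in (+1, -1):
                beyond = [j for j, w in enumerate(window)
                          if side * (w - center) > 2 * sigma]
                if len(beyond) >= 2:
                    flagged.update(i - 2 + j for j in beyond)
    return sorted(flagged)

# Hypothetical process: center 10.0, sigma 0.5, so 3-sigma limits are 8.5..11.5.
data = [10.1, 9.8, 10.2, 9.9, 11.7, 10.0, 11.2, 11.3, 9.9]
print(weco_violations(data, center=10.0, sigma=0.5))  # flags indices 4, 6, 7
```

Rule 1 catches the single excursion at index 4, while Rule 2 catches the run of points hugging the upper 2-sigma zone; that is the sense in which these rules give early warning before a hard limit is breached.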
Let’s look at the second point first. These tests only work if the population is at least approximately normally distributed. A professor once told me another way of looking at this: run your chart, then tip it up on its edge and imagine all the points falling to the bottom. Does the pile look like a normal distribution? The distribution does not have to be perfectly normal for these tests to work, of course. But since every process changes over time, it is worth asking the question “does the assumption of a normal distribution still hold?” Lean Business recommends performing normality tests periodically.
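One way to run such a periodic normality test without any special software is a Jarque–Bera-style check on sample skewness and kurtosis. This is a rough sketch, not a full testing procedure: the two samples are simulated, and the cutoff of 5.99 is the 0.05 critical value of a chi-square distribution with 2 degrees of freedom.

```python
import math
import random

def jarque_bera(sample):
    """Jarque-Bera statistic: n/6 * (S^2 + (K - 3)^2 / 4), where S is
    sample skewness and K is sample kurtosis. Large values suggest
    the sample departs from normality."""
    n = len(sample)
    m = sum(sample) / n
    m2 = sum((x - m) ** 2 for x in sample) / n  # variance
    m3 = sum((x - m) ** 3 for x in sample) / n  # third central moment
    m4 = sum((x - m) ** 4 for x in sample) / n  # fourth central moment
    skew = m3 / m2 ** 1.5
    kurt = m4 / m2 ** 2
    return n / 6 * (skew ** 2 + (kurt - 3) ** 2 / 4)

random.seed(1)
normal_ish = [random.gauss(10.0, 0.5) for _ in range(500)]   # simulated in-control data
skewed = [random.expovariate(1.0) for _ in range(500)]       # simulated drifted/skewed data

# Compare against the 0.05 chi-square critical value with 2 df (~5.99).
for name, sample in [("normal-ish", normal_ish), ("skewed", skewed)]:
    jb = jarque_bera(sample)
    print(name, round(jb, 2), "normality plausible" if jb < 5.99 else "not normal")
```

In practice you would feed in a recent window of production measurements; a statistic far above the cutoff is the signal to stop trusting the zone tests until you understand what changed.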
We know for sure that machinery wears out, and that is what drives every manufacturing process to degrade over time. It is also why Motorola’s Six Sigma program actually measured 4.5 sigma instead of a full 6 sigma: Motorola predicted an inherent degradation of process capability of 1.5 sigma.
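The arithmetic behind that 1.5-sigma allowance can be checked directly. A process whose nearest specification limit sits 6 sigma from the mean, after drifting 1.5 sigma toward that limit, has only 4.5 sigma of headroom, and a 4.5-sigma normal tail is the source of the famous 3.4 defects per million opportunities. A sketch using only the standard normal tail:

```python
import math

def tail_dpmo(z):
    """One-sided upper-tail probability of a standard normal at z,
    expressed as defects per million opportunities (DPMO)."""
    return 0.5 * math.erfc(z / math.sqrt(2)) * 1_000_000

# A "6 sigma" process with the assumed 1.5-sigma long-term drift
# has only 6 - 1.5 = 4.5 sigma between its mean and the spec limit.
print(round(tail_dpmo(6.0 - 1.5), 1))   # ~3.4 DPMO, the classic Six Sigma figure
print(round(tail_dpmo(6.0), 4))         # ~0.001 DPMO for an undrifted 6-sigma tail
```

So the widely quoted "Six Sigma quality" defect rate already has the degradation baked in; without the drift, the defect rate would be about a thousand times lower.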
There is not much you can do about machinery degradation except preventive maintenance and vigilant monitoring to confirm that the degrading process still produces a product of acceptable quality. Have you considered compressing your control limits by 10% to see where your constraint(s) are? There is a constraint somewhere.
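Compressing the limits is a one-line change once you have a basic 3-sigma check. Here is a minimal sketch, again with hypothetical data, that flags points against both the normal limits and limits tightened by 10%, so you can see which process steps the tighter band surfaces first:

```python
def out_of_limits(points, center, sigma, squeeze=1.0):
    """Indices of points outside center +/- 3*sigma*squeeze.
    squeeze=0.9 compresses the control limits by 10%."""
    half_width = 3 * sigma * squeeze
    return [i for i, x in enumerate(points)
            if abs(x - center) > half_width]

# Hypothetical process: center 10.0, sigma 0.5, so full limits are 8.5..11.5
# and the 10%-compressed limits are 8.65..11.35.
data = [10.1, 11.4, 9.9, 8.6, 10.0, 11.6]
print(out_of_limits(data, 10.0, 0.5))               # full limits flag index 5
print(out_of_limits(data, 10.0, 0.5, squeeze=0.9))  # tighter limits flag 1, 3, 5
```

The points that only the compressed limits catch are the ones operating closest to the edge; whichever machine or step produces them most often is a good candidate for your constraint.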
Try reading “Theory of Constraints” by Eliyahu Goldratt and you’ll see why SMED (Single-Minute Exchange of Die) is so important.