Article ID: 477980
Journal: European Journal of Operational Research
Published Year: 2015
Pages: 10
File Type: PDF
Abstract

• We analyze Six Sigma performance for non-normal processes.
• We examine changes in failure rates for exponential, Gamma and Weibull processes.
• Higher quality-improvement effort may be required for non-normal processes.
• Reporting the Sigma level as an indication of quality can be misleading.
• Wrong Six Sigma projects can be selected when normality is systematically assumed.

Six Sigma is a widely used method for improving processes across various industry sectors. The target failure rate for Six Sigma projects is 3.4 parts per million (with the conventional 1.5σ mean shift) or about 2 parts per billion (without the shift). In this paper, we show that when a process is exponentially distributed, attaining such performance levels may require a larger reduction in variation (i.e., greater quality-improvement effort). In addition, identifying whether the process data follow a non-normal distribution is important for estimating more accurately the effort required to improve the process. A key finding of this study is that, at a low kσ level, the variation reduction required to improve an exponentially distributed process is less than that required for a normally distributed process, whereas at a higher kσ level the reverse is true. This study also analyzes processes following Gamma and Weibull distributions, and the results further support our concern that simply reporting the Sigma level as an indication of the quality of a product or process can be misleading. Two cost-minimization optimization models are developed to illustrate how underestimating the quality-improvement effort affects the optimal solution. In conclusion, the classical and widely used assumption of a normally distributed process may lead to quality-improvement strategies or Six Sigma project selections that are based on erroneous solutions.
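To make the gap between normal and non-normal failure rates concrete, the following minimal Python sketch (not taken from the paper) compares the one-sided failure rate beyond a specification limit placed at the mean plus k standard deviations for a normal process with the conventional 1.5σ shift and for an exponential process with the same mean and standard deviation. The placement of the limit and the shift are illustrative assumptions only.

from scipy import stats
import numpy as np

def normal_failure_rate(k, shift=1.5):
    # One-sided tail of a shifted standard normal: P(Z > k - shift)
    return stats.norm.sf(k - shift)

def exponential_failure_rate(k):
    # For an exponential with mean mu, sigma = mu, so the limit mu + k*sigma
    # lies at (1 + k) mean lifetimes: P(X > (1 + k) * mu) = exp(-(1 + k))
    return np.exp(-(1.0 + k))

for k in (3, 4.5, 6):
    print(f"k = {k:>4}: normal ~ {normal_failure_rate(k):.2e}, "
          f"exponential ~ {exponential_failure_rate(k):.2e}")

Under these assumptions, at k = 6 the shifted normal tail is about 3.4e-6 (the familiar 3.4 ppm), while the exponential tail is roughly 9.1e-4, illustrating why a much larger variation reduction can be needed to reach the same reported Sigma level when the process is not normal.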
