Definitions

Analyze Phase – The third phase of DMAIC, where the top two or three causes of variation are isolated and proven using statistical methods. See Analyze Overview for more information.

Attribute Data – Data that is assigned a limited number of possible values or categories, such as “acceptable” or “unacceptable.” See discrete data below.

Baseline Capability – The historical sigma level of a process, captured in the Measure Phase.

Confirmation Run – A planned production run (or transactional process implementation) that is set up with critical inputs (x’s) set at their optimal values (derived through designed experiments, regression analysis, and other tools). One or more process outputs (y’s, also known as CTQ’s) are measured to confirm the improved process performance.

Continuous Data – Data that can take on a continuous range of values, such as the diameter of a machined part or the temperature readings taken from a baking process.

Control Phase – The last phase of DMAIC, where process controls are implemented with the goal of sustained process capability over the long run. See Control Overview for more background.

Cost of Quality – The sum of the Prevention, Appraisal (inspection), Internal Failure (scrap, rework, etc.), and External Failure (warranty claims, lost sales, etc.) costs for a given process. As an organization shifts its resources to Prevention activities, the total cost of quality is drastically reduced. Think about the parallels with personal health…
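
As a back-of-the-envelope illustration (the dollar figures below are purely hypothetical), the total is simply the sum of the four categories:

```python
# Hypothetical annual quality costs (in dollars) for one process.
prevention = 40_000         # training, error proofing, planning
appraisal = 60_000          # inspection, audits, test equipment
internal_failure = 120_000  # scrap, rework, downtime
external_failure = 180_000  # warranty claims, returns, lost sales

cost_of_quality = prevention + appraisal + internal_failure + external_failure
print(f"Total cost of quality: ${cost_of_quality:,}")  # $400,000
```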

Critical-to-Quality Characteristic (CTQ) – A characteristic of a product or service which fulfills a critical customer requirement. CTQ’s are the basic elements to be used in driving process measurement, improvement, and control.

Define Phase – The first phase of DMAIC, where business opportunities are reviewed and projects are clearly defined. See Define Overview for more background.

Design of Experiments (DOE) – Pre-planned experiments designed to quantify the effects of process inputs (x’s) on process outputs (y’s). Since DOE’s can be costly to perform, they are typically utilized after the critical process inputs are isolated using other techniques such as multi-vari.
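
As a minimal sketch of the idea (hypothetical coded factors A and B with made-up response values), a two-level, two-factor full-factorial experiment can be analyzed by comparing the average response at each factor's high and low settings:

```python
import numpy as np

# Hypothetical 2^2 full-factorial design: factors A and B coded -1/+1.
A = np.array([-1, +1, -1, +1])
B = np.array([-1, -1, +1, +1])
y = np.array([71.0, 78.0, 74.0, 89.0])  # measured response (y) for each run

# Main effect = mean response at the high setting minus mean at the low setting.
effect_A = y[A == +1].mean() - y[A == -1].mean()
effect_B = y[B == +1].mean() - y[B == -1].mean()
# Interaction effect uses the product of the coded settings.
effect_AB = y[A * B == +1].mean() - y[A * B == -1].mean()

print(f"Effect of A: {effect_A:.1f}, B: {effect_B:.1f}, AB: {effect_AB:.1f}")
```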

Discrete Data – Data that can take on only a few possible values, such as “good” or “bad,” or the number of times that a particular event occurs, like the number of patients that spend more than one hour in a waiting-room each day.

DMAIC – The proven Six Sigma approach to process improvement: Define, Measure, Analyze, Improve, Control. Also see DMAIC in everyday life.

Dot Plot – A simple, single-axis plot that shows a dot for each data point (similar to a histogram, without the bars). This is an effective tool for quickly reviewing how data points are distributed (where the data points are grouped, where the outliers are located, etc.), and also for visually comparing two or more data sets.

DPM – Defects Per Million

DPMO – Defects Per Million Opportunities

DPO – Defects Per Opportunity

DPU – Defects Per Unit
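
These four metrics differ only in how the defect count is normalized. A minimal sketch with hypothetical counts (treating DPM here as defects per million units):

```python
# Hypothetical inspection results.
units = 5_000          # units produced
defects = 37           # total defects found
opportunities = 12     # defect opportunities per unit

dpu = defects / units                    # Defects Per Unit
dpo = defects / (units * opportunities)  # Defects Per Opportunity
dpmo = dpo * 1_000_000                   # Defects Per Million Opportunities
dpm = dpu * 1_000_000                    # Defects Per Million (units)

print(f"DPU={dpu:.4f}  DPO={dpo:.6f}  DPMO={dpmo:.0f}  DPM={dpm:.0f}")
```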

Five Whys – Ask “Why?” five times, and you will often arrive at a root cause. Example: Car will not start. Why? Engine seized. Why? Oil level was too low. Why? Did not add oil. Why? Did not know that oil level should be monitored. Why? Lack of training in basic car maintenance. In many cases, the root cause is arrived at before the 5th Why. Here is more on the Five Whys from Wikipedia.

Error Proofing – Automatically prevents a process step from taking place if an error condition exists – error proofing prevents defects from being created in the first place. Example: most manual-transmission cars will not start unless the clutch is depressed, preventing the “defect” of the car jumping forward if in gear. The next-best approach for zero defects is mistake proofing, which prevents an already-generated defect from moving forward in the process.

Fishbone Diagram – Also known as Ishikawa Diagrams, fishbone diagrams are a graphical method for grouping and identifying possible causes or variation sources. Fishbone diagrams are commonly used in the Analyze phase of a Six Sigma project. Here is more on Fishbone Diagrams from Wikipedia.

Gage R&R – Gage Repeatability and Reproducibility – The amount of variation introduced by the measurement system itself.
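
A minimal sketch of how the study results are typically combined (the standard deviations below are hypothetical; variance components add, standard deviations do not):

```python
import math

# Hypothetical standard deviations from a measurement system study.
sd_repeatability = 0.8    # equipment variation (same operator, same part)
sd_reproducibility = 0.6  # operator-to-operator variation
sd_part_to_part = 4.0     # true part-to-part variation

# Combine variance components, not standard deviations.
sd_grr = math.sqrt(sd_repeatability**2 + sd_reproducibility**2)
sd_total = math.sqrt(sd_grr**2 + sd_part_to_part**2)

pct_grr = 100 * sd_grr / sd_total  # share of total variation consumed by the gage
print(f"%GR&R = {pct_grr:.1f}%")   # roughly 24% here; under 10% is a commonly cited target
```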

Histogram – A graphical method for visualizing data. See Histograms for more background.

Hypothesis Testing – Determines whether the observed differences between two or more sets of data are statistically significant or more likely due to random chance. A hypothesis test always begins by stating a Null Hypothesis, which assumes that there is no difference between the sample parameters being compared (mean vs. mean, standard deviation vs. standard deviation, etc.). Statistical methods are then used to estimate the probability that the observed differences in the sample parameters could have come from the same population (this probability is known as the p-value). If the p-value is low (typically 5% or lower), then the null hypothesis is rejected and the samples are considered likely to have come from different populations.
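
A minimal sketch using a two-sample t-test on hypothetical cycle-time data (scipy's ttest_ind is one common way to compare two means):

```python
from scipy import stats

# Hypothetical cycle-time samples (minutes) from two versions of a process.
before = [12.1, 11.8, 12.6, 12.3, 11.9, 12.4, 12.2, 12.5]
after = [11.4, 11.7, 11.2, 11.6, 11.5, 11.9, 11.3, 11.6]

# Null hypothesis: the two population means are equal.
t_stat, p_value = stats.ttest_ind(before, after)

if p_value < 0.05:
    print(f"p = {p_value:.4f}: reject the null hypothesis (means likely differ)")
else:
    print(f"p = {p_value:.4f}: cannot reject the null hypothesis")
```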

Improve Phase – The fourth phase of DMAIC, where the critical process inputs are optimized to produce the desired process output. See Improve Overview for more background.

Long-Term Data – Data collected over an extended time period that likely encompasses all sources of variation present over the long run.

Measure Phase – The second phase of DMAIC, where the current process is documented, the measurement system evaluated, and the baseline process performance measured. See Measure Overview for more background.

Mistake Proofing – Automatically prevents an existing defect from moving forward in the process. Example: A ball bearing is pressed onto a shaft with the dust seal upside-down, but the next operation detects the condition using a vision system, preventing the defective sub-assembly from being pressed into a gear case.

Multi-Vari Charts – An excellent graphical tool for understanding the major components of variation early in a Six Sigma project. Variation categories can include within-part, fixture-to-fixture, part-to-part, and time-to-time. Multi-vari also lends itself to non-manufacturing operations. We highly recommend Keki Bhote’s book, World Class Quality, which details the use of this tool.

Normal Data – Data that forms a bell-shaped, symmetrical histogram – indicative of multiple, naturally-occurring random events. The 3.4 DPM defect rate for Six Sigma processes assumes that the underlying data are normally distributed.

Normal Probability Curve – A curve that shows the theoretical shape of a normally-distributed histogram. The shape of the normal probability curve is based on two parameters: mean (average) and standard deviation (sigma). See normal probability curve.
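
A small sketch with a hypothetical mean and standard deviation, evaluating the normal probability density directly:

```python
import numpy as np
from scipy import stats

mu, sigma = 10.0, 0.5  # hypothetical process mean and standard deviation
x = np.linspace(mu - 4 * sigma, mu + 4 * sigma, 201)

# Normal probability density: (1 / (sigma * sqrt(2*pi))) * exp(-(x - mu)^2 / (2*sigma^2))
pdf = stats.norm.pdf(x, loc=mu, scale=sigma)

print(f"Peak height at the mean: {pdf.max():.3f}")  # 1 / (sigma * sqrt(2*pi)) ≈ 0.798
```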

Pareto Principle – Also known as the 80/20 rule: roughly 80% of the problems in a process typically stem from 20% of the causes. This page shows how to make a Pareto chart.
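
A minimal sketch of a Pareto chart (hypothetical defect categories and counts), sorting the bars largest-first and overlaying the cumulative percentage:

```python
import matplotlib.pyplot as plt

# Hypothetical defect counts by category.
counts = {"Scratches": 52, "Dents": 31, "Mislabels": 11, "Cracks": 4, "Other": 2}
categories = sorted(counts, key=counts.get, reverse=True)
values = [counts[c] for c in categories]
cumulative = [sum(values[: i + 1]) / sum(values) * 100 for i in range(len(values))]

fig, ax = plt.subplots()
ax.bar(categories, values)                                 # bars: counts, largest first
ax2 = ax.twinx()
ax2.plot(categories, cumulative, marker="o", color="red")  # cumulative % line
ax.set_ylabel("Defect count")
ax2.set_ylabel("Cumulative %")
plt.show()
```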

PFMEA – Process Failure Modes and Effects Analysis. A structured approach for identifying potential failure modes and associated risk levels for each step in a process. Potential failure modes are rated by multiplying three factors (Severity x Occurrence x Detection) to produce a Risk Priority Number (RPN). All RPN’s are then sorted to prioritize preventive actions (countermeasures).
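
A minimal sketch of the RPN calculation and ranking (hypothetical failure modes and 1–10 ratings):

```python
# Hypothetical failure modes with Severity, Occurrence, Detection ratings (1-10).
failure_modes = [
    ("Label printed with wrong lot code", 7, 4, 3),
    ("Seal not fully welded",             9, 2, 5),
    ("Fastener torque too low",           6, 5, 6),
]

# RPN = Severity x Occurrence x Detection; sort highest-risk items first.
ranked = sorted(failure_modes, key=lambda fm: fm[1] * fm[2] * fm[3], reverse=True)
for name, sev, occ, det in ranked:
    print(f"RPN {sev * occ * det:3d}  {name}")
```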

Process Control Plan – A comprehensive document that specifies all process controls necessary to build a good product or deliver a successful transaction. Process controls can include error proofing (and verification of error proofing devices), mistake proofing (and verification of mistake proofing devices), SPC, and periodic inspections. Compliance with the PCP is an important element in internal quality system audits.

Process Flowchart – A graphical means for describing a process. See process flowcharts for more information.

Regression – Regression analysis attempts to build an equation relating one or more process inputs (x’s) to a given process output (y). Regression analysis is typically used for further defining the relationship between x’s and y’s, after the significant x’s have been identified through designed experiments, hypothesis testing, and other methods.
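
A minimal sketch of a simple linear regression (hypothetical temperature/hardness data; numpy's polyfit is one straightforward way to fit a line by least squares):

```python
import numpy as np

# Hypothetical data: oven temperature (x) vs. measured part hardness (y).
x = np.array([150, 160, 170, 180, 190, 200], dtype=float)
y = np.array([48.2, 50.1, 53.0, 54.8, 57.3, 59.1])

# Fit y = slope * x + intercept by least squares.
slope, intercept = np.polyfit(x, y, deg=1)
predicted = slope * x + intercept
r_squared = 1 - np.sum((y - predicted) ** 2) / np.sum((y - y.mean()) ** 2)

print(f"y ≈ {slope:.3f} * x + {intercept:.2f}   (R² = {r_squared:.3f})")
```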

Short-Term Data – Data that represents a snap-shot of the process but does not likely include all sources of variation that might be present over the long run. Six Sigma uses the 1.5-sigma mean shift to translate short-term data into long-term defect rates.

Sigma Level – A measure of process capability: the higher the sigma level, the more capable the process is. A Six Sigma process has a short-term sigma level of 6, and a long-term sigma level of 4.5 (see why not 4.5 sigma?). The theoretical defect rate for a Six Sigma process is 3.4 defects per million (DPM). Simply put, the sigma level indicates how many standard deviations (“Sigmas”) can fit inside the gap between the process average and the nearest specification limit. See Sigma Level Estimate for more background.
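
To tie the numbers together, here is a minimal sketch that converts a long-term defect rate of 3.4 DPM back into the sigma levels quoted above, using the normal quantile function and the conventional 1.5-sigma shift:

```python
from scipy import stats

dpm_long_term = 3.4                   # long-term defect rate (DPM)
p_defect = dpm_long_term / 1_000_000  # long-term probability of a defect

# Long-term sigma level: distance (in standard deviations) from the mean
# to the nearest specification limit for this defect rate.
sigma_long_term = stats.norm.ppf(1 - p_defect)

# Conventional Six Sigma reporting adds the 1.5-sigma shift back in.
sigma_short_term = sigma_long_term + 1.5

print(f"Long-term: {sigma_long_term:.2f} sigma, short-term: {sigma_short_term:.2f} sigma")
```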

Six Sigma – Six Sigma is two things: (1) a process improvement methodology known as DMAIC, and (2) a statistical statement about process capability: Six Sigma processes have a theoretical defect rate of 3.4 dpm (defects per million) over the long run, which is essentially zero dpm when the Control phase is implemented properly.

SPC – Statistical Process Control – Process and/or product monitoring that uses statistical theory to detect abnormal conditions. SPC falls behind error proofing and mistake proofing as a reliable control method, but is considered an improvement over inspection alone. A number of control charting formats are available, and pre-control is the most effective that we have observed.
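
As one minimal sketch (hypothetical measurements on an individuals chart with conventional 3-sigma-equivalent limits, rather than pre-control):

```python
import numpy as np

# Hypothetical individual measurements collected in time order.
data = np.array([10.2, 9.8, 10.1, 10.4, 9.9, 10.0, 10.3, 9.7, 10.1, 10.2])

moving_range = np.abs(np.diff(data))  # range between consecutive points
center = data.mean()
# Individuals-chart limits: mean +/- 2.66 * average moving range
# (2.66 is approximately 3 / d2, with d2 = 1.128 for subgroups of size 2).
ucl = center + 2.66 * moving_range.mean()
lcl = center - 2.66 * moving_range.mean()

out_of_control = data[(data > ucl) | (data < lcl)]
print(f"Center={center:.2f}  UCL={ucl:.2f}  LCL={lcl:.2f}  "
      f"Out-of-control points: {out_of_control}")
```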