George Box (University of Wisconsin–Madison Professor) – Some aspects of statistical design in quality improvement (1987)


Chapters

00:00:35 Quality Control: Economic, Social, and Research Consequences
00:02:58 The Evolution of Scientific Discovery and Technological Change
00:08:36 Process Improvement Through Data Analysis
00:12:34 Strategies for Optimizing Process and Product Quality
00:24:43 Dispersion Effects from Fractional Designs
00:32:05 Iterative Statistical Investigation in Experimental Design
00:36:50 Sensitivity Analysis of Fractional Factorial Designs

Abstract

Revolution in Quality: How Simple Scientific Methods Are Reshaping Industries

The landscape of global manufacturing and product quality has shifted markedly with the application of simple scientific methods and quality control techniques. This article examines the economic consequences for U.S. industries losing market share to foreign competition, notably to Japanese automakers, whose success is reflected in far fewer product returns. Statistical and experimental techniques have been democratized, accelerating technological change and deepening product understanding. These methodologies, from basic quality control charts to advanced fractional factorial designs, have enabled industries to achieve profound improvements in quality and efficiency.

The Economic Impact of Quality Revolution

Increased foreign competition, particularly from Japanese automakers, has led to a decline in U.S. market share in various sectors, including automobiles, cameras, and stereo components. This loss underscores the urgent need for American industries to adopt more efficient and quality-centric production methods.

Japanese Success and Its Lessons

Japanese automakers have recorded far fewer product returns within the first six months than their U.S. counterparts, demonstrating a superior approach to quality control and production processes. Studying their success can provide valuable insights for U.S. industries aiming to regain their competitive edge.

Embracing Scientific Methods for Quality Improvement

The core of this industrial revolution lies in applying scientific methods to understand and improve products and processes. This approach involves both informed observation, such as quality control charts, and directed experimentation, which induces informative events akin to Franklin’s kite experiment.

The Role of Simple Techniques in Quality Control

The effectiveness of quality improvement methods is not necessarily tied to complexity. Simple techniques, such as Ishikawa’s “seven tools” for quality control, including Pareto charts and histograms, have proven remarkably effective when used widely within the workforce.
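
As an illustration of how little machinery these tools require, the Python sketch below builds a basic Pareto chart from hypothetical defect tallies (the categories and counts are invented for the example): a bar chart of counts sorted from largest to smallest, with a cumulative-percentage line on a second axis.

```python
import matplotlib.pyplot as plt

# Hypothetical defect tallies from a shop-floor check sheet.
defects = {"scratches": 42, "misalignment": 27, "porosity": 13, "cracks": 6, "other": 4}
labels, counts = zip(*sorted(defects.items(), key=lambda kv: kv[1], reverse=True))
cum_pct = [100 * sum(counts[:i + 1]) / sum(counts) for i in range(len(counts))]

fig, ax1 = plt.subplots()
ax1.bar(labels, counts)                # bars: defect counts, largest first
ax1.set_ylabel("count")
ax2 = ax1.twinx()                      # second axis for the cumulative line
ax2.plot(labels, cum_pct, marker="o")
ax2.set_ylabel("cumulative %")
ax2.set_ylim(0, 100)
plt.show()
```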

Shifting From Inspection to Building Quality

Dr. Deming’s 14 points emphasize a fundamental shift from merely inspecting products to building quality into the manufacturing process itself, thereby reducing the need for extensive quality control checks.

Advancements in Experimental Design

Fractional factorial designs, initially developed in 1933, have played a pivotal role in efficiently screening many factors in experiments, thus enhancing quality improvement processes. Their effectiveness lies in their symmetric filling of experimental space and projective properties.
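
As a concrete illustration of those two properties, the Python sketch below (a minimal example, not necessarily the designs discussed in the talk) builds a standard 8-run, 7-factor two-level fraction from three base factors and checks its projectivity: every pair of columns contains a complete, replicated 2×2 factorial.

```python
import itertools
import numpy as np

# Full 2^3 factorial in coded units (-1, +1) for base factors A, B, C.
base = np.array(list(itertools.product([-1, 1], repeat=3)))
A, B, C = base[:, 0], base[:, 1], base[:, 2]

# A 2^(7-4) fraction: the remaining factors use the generators
# D = AB, E = AC, F = BC, G = ABC.
design = np.column_stack([A, B, C, A * B, A * C, B * C, A * B * C])
print(design)  # 8 runs x 7 factors; every column has four +1s and four -1s

# Projective property: this resolution III fraction has projectivity 2,
# so any two columns together contain all four +-1 combinations.
for i, j in itertools.combinations(range(7), 2):
    pairs = {tuple(row) for row in design[:, [i, j]]}
    assert pairs == {(-1, -1), (-1, 1), (1, -1), (1, 1)}
```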

The Madison Center’s Contributions

The Center for Quality and Productivity Improvement at the University of Wisconsin–Madison, established to focus research on quality and productivity, has published 23 reports on various aspects of the subject. Its focus on preliminary models paves the way for more sophisticated fitting techniques in the future.

Optimizing Mean and Variance in Processes

Simultaneously optimizing the mean and variance of a process is crucial for achieving a desired average response with minimal variability. However, conducting large experiments with multiple repeat runs presents challenges.
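
One simple reading of "optimizing mean and variance together" is to treat the average and the log sample variance at each design point as two separate responses and estimate effects for each. The Python sketch below does this for a small hypothetical replicated 2^2 experiment (all numbers are simulated for illustration only); it also makes the cost visible, since every design point must be repeated several times before its variance can be estimated.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical replicated 2^2 experiment: 4 design points, 5 repeat runs each.
# The simulated process has a larger spread when the second factor is high.
design = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]], dtype=float)
reps = np.array([10 + 2 * x1 - x2 + rng.normal(0, 1.0 + 0.5 * (x2 > 0), size=5)
                 for x1, x2 in design])

# Treat the mean and the log sample variance at each point as two responses.
y_mean = reps.mean(axis=1)
y_logvar = np.log(reps.var(axis=1, ddof=1))

# Least-squares estimates (intercept plus two main effects) for each response.
X = np.column_stack([np.ones(len(design)), design])
beta_mean, *_ = np.linalg.lstsq(X, y_mean, rcond=None)
beta_logvar, *_ = np.linalg.lstsq(X, y_logvar, rcond=None)

print("location (mean) effects:     ", np.round(beta_mean, 2))
print("dispersion (log-var) effects:", np.round(beta_logvar, 2))
```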

Identifying and Addressing Dispersion Effects

Dispersion effects, the influence of particular factors on the variance of the response, have been a focus in quality control. Research has explored the use of fractional designs to study dispersion effects, for example in a welding experiment with nine variables run as a 16-run fractional factorial design.
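
One way to screen for dispersion effects without replicating every run, in the spirit of the fractional-design work mentioned above, is to fit the location effects first and then compare the spread of the residuals at the two levels of each design column. The Python sketch below is a simplified version of that idea with invented numbers (it is not the welding data): for each column it computes the log ratio of residual variances at the + and - levels.

```python
import itertools
import numpy as np

def dispersion_statistics(design, residuals):
    """For each +-1 design column, the log ratio of residual variances at the
    + and - levels; a large absolute value flags a possible dispersion effect."""
    design = np.asarray(design)
    residuals = np.asarray(residuals, dtype=float)
    stats = []
    for col in design.T:
        s2_plus = residuals[col == 1].var(ddof=1)
        s2_minus = residuals[col == -1].var(ddof=1)
        stats.append(np.log(s2_plus / s2_minus))
    return np.array(stats)

# Illustrative usage: an 8-run, 3-factor design and residuals left over after
# fitting the location effects (values invented so that the first factor
# visibly inflates the spread at its + level).
design8 = np.array(list(itertools.product([-1, 1], repeat=3)))
residuals = np.array([0.2, -0.3, 0.1, -0.2, 1.1, -1.4, 0.9, -1.2])
print(np.round(dispersion_statistics(design8, residuals), 2))
```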

Iterative Experimental Design and Bayesian Analysis

The field of experimental design and analysis is increasingly adopting iterative approaches and Bayesian models. These models incorporate prior assumptions of effect sparsity and enable more nuanced decision-making.

Analyzing Fractional Factorials via Bayes’ Theorem

By analyzing fractional factorials with Bayes' theorem, one can estimate the effects of factors and the level of noise in the data. Two parameters, alpha and kappa, enter the analysis: alpha is the prior proportion of effects expected to be active, and kappa is the scale of active effects relative to the noise. Estimates for alpha and kappa can be obtained by examining past published examples of fractional factorials. The analysis yields a posterior probability for each factor and interaction, indicating how likely it is to be active. A sensitivity analysis can be conducted to assess the impact of varying alpha and kappa within reasonable ranges, and the conclusions are generally robust to such variation. The method has been applied in numerous examples and has proven useful in analyzing fractional factorials.
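
The sketch below is a deliberately simplified Python illustration of this kind of calculation, not the full analysis described in the talk: it assumes the noise scale sigma is known and treats each estimated contrast independently, modelling inert effects as N(0, sigma^2) and active ones as N(0, (kappa*sigma)^2) with prior probability alpha of activity. The contrast values and parameter ranges are invented for illustration; the loop shows the kind of sensitivity check over alpha and kappa mentioned above.

```python
import numpy as np
from scipy.stats import norm

def activity_probabilities(effects, sigma, alpha=0.2, kappa=10.0):
    """Simplified posterior probability that each estimated effect is active.

    Inert effects are modelled as N(0, sigma^2), active effects as
    N(0, (kappa*sigma)^2), with prior probability alpha of being active.
    (A simplification: the full analysis also integrates over the unknown
    noise scale rather than treating sigma as known.)
    """
    effects = np.asarray(effects, dtype=float)
    like_active = norm.pdf(effects, scale=kappa * sigma)
    like_inert = norm.pdf(effects, scale=sigma)
    return alpha * like_active / (alpha * like_active + (1 - alpha) * like_inert)

# Hypothetical standardized contrasts from a fractional factorial.
effects = [0.4, -0.2, 5.1, 0.7, -3.8, 0.1, 0.3]

# Sensitivity check: vary alpha and kappa over plausible ranges and see
# whether the same contrasts stand out as probably active.
for alpha in (0.15, 0.20, 0.30):
    for kappa in (5.0, 10.0, 15.0):
        probs = activity_probabilities(effects, sigma=1.0, alpha=alpha, kappa=kappa)
        print(f"alpha={alpha}, kappa={kappa}:", np.round(probs, 2))
```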

Conclusion

This quality revolution, marked by the adoption of simple yet scientifically grounded methods, signals a significant shift in how industries approach production and quality control. Learning from Japanese successes and utilizing innovative methods in quality control and experimental design can help industries worldwide enhance their competitiveness and product quality.


Notes by: Hephaestus