
SIX SIGMA GREEN BELT TRAINING
SIX SIGMA OVERVIEW AND EVOLUTION
What is Six Sigma
- A customer-focused business improvement process
- Driven by teamwork, consensus & logical reasoning
- A structured methodology: DMAIC
- Focuses on making the process robust & reducing variation
- Applies to any process


What is Six Sigma
Six Sigma is a highly disciplined and
quantitative strategic business
improvement approach that seeks to
increase both customer satisfaction
and an organization's financial health.
Six Sigma helps a company focus on
developing and delivering near-
perfect products (durable goods or
services), to improve customer
satisfaction and the bottom line.
What Six Sigma is NOT
- Just about statistics
- A quality program only
- Only for technical people
- Used when the solution is already known
- Used for firefighting
- A cure for world hunger
Six Sigma: A Note from the Originator of Six Sigma
"Six Sigma is not an improvement program. It is
instead a business philosophy that employs a
step-by-step approach to reducing variation,
increasing quality, customer satisfaction,
and in time, market share."
Overview of Six Sigma

- Six Sigma as a philosophy: cultural change, transform the organization
- Six Sigma as a process: growth, reduce costs
- Six Sigma as a statistical tool: pain, urgency, survival

What is Six Sigma?
Sigma is a measurement that indicates
how a process is performing.
Six Sigma stands for six standard
deviations (sigma is the Greek letter used
to represent standard deviation in
statistics) from the mean. The Six Sigma
methodology provides the techniques and
tools to improve the capability and reduce
the defects in any process.
Six Sigma is the structured application of tools
and techniques, applied on a project basis, to
achieve sustained strategic results.
What is Six Sigma
A Vision of a Six Sigma Company

Organizational Issue | Traditional Approach | Six Sigma Approach
Problem resolution behavior | Fixing (symptoms) | Preventing (causes)
Decision making | Reactive | Data-based
Process adjustment | Experience-based tweaking | Controlling capability
Supplier relationship | Cost (piece price) | Long-term producibility
Planning | Short-term | Long-term
Design | Performance | Producibility
Employee training | If time permits | Mandated
Chain of command | Hierarchy | Empowered teams
Direction | Seat-of-the-pants | Benchmarking and metrics
Manpower | Cost | Asset
Character of 6σ
Traditional Quality Method vs. Six Sigma Quality

Issue | Traditional Approach | Six Sigma Approach
Index | % (defect rate) | Sigma level
Data | Discrete | Discrete + continuous
Target | Satisfaction of the mfg. process | Customer satisfaction
Range | Spec outliers | Variation improvement
Method | Experience + job | Experience + job + statistical ability
Action | Bottom-up | Top-down
Aligning The Focus
[Figure: individuals and work groups move from a tactical, "let's do it" direction (traditional) to a strategic direction (future); on the Six Sigma journey, the many unassigned projects become chartered Six Sigma projects.]
What is Six Sigma
Definition

Sigma Level | % Yield | Defects/Million Opportunities
2 | 69.1 | 308,537
3 | 93.3 | 66,807
4 | 99.4 | 6,210
5 | 99.98 | 233
6 | 99.9997 | 3.4
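These yield and DPMO figures follow from the normal distribution once the conventional 1.5σ long-term shift is assumed. As a minimal sketch (the function name below is ours, not part of any Six Sigma standard):

```python
import math

def dpmo(sigma_level, shift=1.5):
    """Defects per million opportunities for a given sigma level,
    using the conventional 1.5-sigma long-term shift and the
    one-sided tail of the standard normal distribution."""
    z = sigma_level - shift
    tail = 0.5 * math.erfc(z / math.sqrt(2))  # P(Z > z)
    return tail * 1_000_000

for level in (2, 3, 4, 5, 6):
    print(f"{level} sigma: {dpmo(level):,.1f} DPMO")
# 6 sigma works out to about 3.4 DPMO, matching the table.
```

Small differences from the table (e.g. 308,538 vs. 308,537) are just rounding.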
Six Sigma : The Statistical Way
[Figure: a process that is off target with excessive variation, shown against its LSL and USL; first center the process on the target, then reduce the spread.]
Reduce variation and center the process: customers feel the variation more than the mean.
Six Sigma Practical Meaning

99% Good (3.8 Sigma):
- 20,000 lost articles of mail per hour
- Unsafe drinking water for almost 15 minutes each day
- 5,000 incorrect surgical operations per week
- Two short or long landings at most major airports each day
- 200,000 wrong drug prescriptions each year
- No electricity for almost seven hours each month

99.99966% Good (6 Sigma):
- Seven articles lost per hour
- One unsafe minute every seven months
- 1.7 incorrect operations per week
- One short or long landing every five years
- 68 wrong prescriptions per year
- One hour without electricity every 34 years
Philosophy of Six Sigma
- Know What's Important to the Customer (CTQ)
- Reduce Defects (DPMO)
- Center Around Target (Mean)
- Reduce Variation (σ)
Harvesting the fruit of Six
Sigma
HISTORY OF SIX
SIGMA
History of Six Sigma
Quality tools like SPC, cost of quality, control
charts, process capability, etc. have been known to
industry for a long time, since well before the birth
of Six Sigma.
Quality tools and quality system
implementation were not in conjunction with
overall business goals.
Traditional quality tools have limitations in
orienting quality-improvement efforts to
the organizational direction, basically due to their
approach.
Motorola was the first company to initiate the
Six Sigma breakthrough strategy.
A Little Bit Of History
Six Sigma was developed by Bill Smith, QM at
Motorola
Its implementation began at Motorola in 1987
It allowed Motorola to win the first Baldrige Award in
1988
Motorola recorded more than $16 Billion savings as
a result of Six Sigma
Several of the major companies in the world have
adopted Six Sigma since then.
Texas Instruments, Asea Brown Boveri, AlliedSignal,
General Electric, Bombardier, Nokia Mobile Phones,
Lockheed Martin, Sony, Polaroid, DuPont, American
Express, Ford Motor, and others.
The Six Sigma Breakthrough Strategy has become a
Competitive Tool
Motorola Case Study
In the early 1980s Motorola was facing a
serious competitive challenge from
Japanese companies.
Motorola was losing market share
and customer confidence.
Motorola had not made any major
changes to its products.
The competitors from Japan were
offering much better products at a much
lower price, with no field failures.
Motorola Case Study (continued)
When Motorola studied the competitors' products, it
was revealed that the variation in key product
characteristics was very low.
The competitors' products were available at a lower
price.
The competitors' products had a very low warranty
failure rate.
Motorola was not able to match the competitors'
price, mainly due to the high cost of poor quality:
a high reject rate, high rework/repair rate,
high inspection cost, high warranty failure rate, etc.
THE TECHNICAL TEAM CONCLUDED THAT THE
COMPETITORS WERE OFFERING BETTER PRODUCTS
AT LOWER COST.
Motorola Case Study
Motorola requested permission from the
Japanese competitors for a team from
Motorola to visit them for a study.
Motorola sent a team of managers to
Japan to study the magic of the
Japanese companies.
What did the team discover?
Motorola Case Study
What Motorola learned was as follows:
Motorola was focusing too much on product
quality, i.e. inspection, rework, repair, etc.
The internal defect rate was very high inside
Motorola.
Reliability was low, since some of the
defects were passing on to the customer through
inspection lapses.
A dissatisfied customer complained loudly and
took away at least 10 potential customers.
As an effect of this, customers were lost to the
competitors.
Motorola Case Study
What was wrong?
The Japanese companies were concentrating on:
- Customers
- Processes
- People
Variation in product and process parameters was known
and controlled.
All people were well trained and highly motivated.
All activities and processes were highly standardized, i.e.
no person dependence.
Defect-free lines and robust processes.
Very few inspectors.
Yet, a very low defect rate, internal rejection and customer
complaints.
VERY HIGH LEVEL OF CUSTOMER SATISFACTION
Motorola Case Study
WHAT WAS THE SECRET?

THE SECRET WAS CONTROL OVER VARIATION

Success factor:
Proactive Vs. Reactive Quality
The Impact Of Added Inspection
If the likelihood of detecting the
defect is 70% and we have 10
consecutive inspectors with this
level of capability, we would still expect
about 6 escaping defects for every
1,000,000 defective products produced.
[Figure: 100,000 ppm falling to 6 ppm with added inspection, against the 3.4 ppm Six Sigma level.]

You can save yourself by producing quality, not by inspecting for it.
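The arithmetic behind the slide: each 70%-effective inspection misses 30% of the defects that reach it, so ten independent inspections in series miss 0.3^10 of them. A minimal Python check (the 70% detection rate and the independence of inspectors are the slide's assumptions):

```python
def escaping_defects_ppm(detect_prob, n_inspections):
    """PPM of defective units that slip past n independent inspections,
    each catching detect_prob of the defects that reach it."""
    miss = 1.0 - detect_prob
    return 1_000_000 * miss ** n_inspections

# Ten consecutive 70%-effective inspectors:
print(round(escaping_defects_ppm(0.70, 10), 1))  # 5.9 -> "about 6 ppm"
```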
Motorola Case Study
In order to address these issues, Motorola
devised the Six Sigma methodology.
Dr. Mikel Harry and Mr. Bill Smith were
pioneers in developing and implementing
the Six Sigma methodology at Motorola.
With the implementation of Six Sigma,
Motorola could achieve:
- the 4σ level in one and a half years
- the 5σ level in the following year
- the 6σ level in the year after that
Six Sigma Progress
- 1985: Dr. Mikel J. Harry wrote a paper relating early failures to quality
- 1987: Motorola
- 1992: Allied Signal
- 1995: General Electric
- 2002: Johnson & Johnson, Ford, Nissan, Honeywell
What can it do?
Motorola:
- 5-fold growth in sales
- Profits climbing by 20% per annum
- Cumulative savings of $14 billion over 11 years
General Electric:
- $2 billion savings in just 3 years
- The no. 1 company in the USA
Bechtel Corporation:
- $200 million savings with an investment of $30 million
It is high time that Indian companies also start
implementing Six Sigma, to make breakthrough
improvements and to remain globally competitive.
Quality and Value
Attempting to Define Quality
Experts' definitions of quality fall into two categories:
- Level one: quality is a simple matter of producing
products or delivering services whose measurable
characteristics satisfy a fixed set of specifications
that are usually numerically defined.
- Level two: independent of any of their measurable
characteristics, quality products and services are
simply those that satisfy customer expectations for
their use or consumption.

In short, level one quality means get it in the specs,
and level two means satisfy the customer.
Quality Gap
- Understanding Gap: between customer expectations and the understanding of needs
- Design Gap: between the understanding of needs and the design of products
- Process Gap: between the design of products and the capability to deliver the design (process design)
- Operations Gap: between the capability to deliver and the actual delivery
- Perception Gap: between the actual delivery and the customer's perception of the delivery
Together, these gaps make up the overall Quality Gap.
Nine Dimensions of QUALITY
According to modern management
concepts, quality has nine dimensions:
1) Performance: main characteristics of the product/service
2) Aesthetics: appearance, feel, smell, taste
3) Special features: extras
4) Conformance: how well the product/service conforms to customers' expectations
5) Safety: risk of injury
6) Reliability: consistency of performance
7) Durability: useful life of the product/service
8) Perceived Quality: indirect evaluation of quality (e.g. reputation)
9) Service after Sale: handling of customer complaints and follow-up checking
Evolution of Quality
- Reactive Quality (historical): quality checks (QC), taking the defectives out of what is produced.
- Proactive Quality (contemporary): create processes that will produce few or no defects.
Old Concept Of Quality
Past concepts of quality focused on
conformance to standards. This definition
assumed that as long as the company
produced quality products and services,
its performance standard was correct,
regardless of how those standards were
met. Moreover, the setting of standards and
the measurement of performance were mainly
confined to the production areas, while the
commercial and other service functions
were managed through command and control.
Value Enrichment
The term Value Enrichment for
the company means that they
must strive to produce highest
quality products at the lowest
possible costs to be competitive in
the global markets.
For customers, the term Value
Enrichment means that they have
the right to purchase high quality
products/services at the lowest
cost.
Concept Of Value

Value to Customers = (Real Value + Perceived Value) / (Price + Inconvenience)
Definitions
VALUE:
THOSE ACTIVITIES THAT CONVERT
MATERIALS OR IDEAS INTO GOODS
OR SERVICES THAT GENERATE CASH
Definition

ANYTHING THAT
IS NOT VALUE IS
WASTE
Six Sigma and Cost Of
Quality
Six Sigma has a very significant impact on
the cost of quality. As the sigma level
moves up, the cost of quality comes down,
and vice versa. Traditionally recorded quality
costs generally account for only 4 to 5
percent of sales, mainly comprising
scrap, rework and warranty.

There are additional costs of quality which
are hidden and do not appear in the account
books of the company, as they are intangible
and difficult to measure.
Visible And Hidden Costs
Visible costs:
- Scrap
- Rework
- Warranty costs
Hidden costs:
- Conversion efficiency of materials
- Inadequate resource utilization
- Excessive use of materials
- Cost of re-design and re-inspection
- Cost of resolving customer problems
- Lost customers / goodwill
- High inventory
Cost Of Quality At Various Levels Of Sigma

Sigma | Defect Rate (PPM) | Cost of Quality | Competitive Level
6 | 3.4 | <10% | World class
5 | 233 | 10-15% |
4 | 6,210 | 15-20% | Industry average
3 | 66,807 | 20-30% |
2 | 308,537 | 30-40% | Non-competitive
1 | 690,000 | >40% |
What is The Cost Of Quality?
Cost of Quality: the cost of
ensuring that the job is done
right + the cost of not doing the
job right.

Cost of Quality = Cost of Conformance (Prevention and Appraisal) + Cost of Non-Conformance (Internal/External Defects)
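As a trivial sketch of that identity (the cost figures in the example are invented purely for illustration):

```python
def cost_of_quality(prevention, appraisal, internal_failure, external_failure):
    """Cost of Quality = Cost of Conformance (prevention + appraisal)
    + Cost of Non-Conformance (internal + external failure)."""
    cost_of_conformance = prevention + appraisal
    cost_of_non_conformance = internal_failure + external_failure
    return cost_of_conformance + cost_of_non_conformance

# Invented annual figures for illustration:
print(cost_of_quality(prevention=20, appraisal=35,
                      internal_failure=80, external_failure=45))  # 180
```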
Cost Of Quality: Direct Costs

Prevention Costs:
- Quality planning
- Process evaluation / improvement
- Quality improvement meetings
- Quality training
Appraisal Costs:
- Source inspection
- In- / end-process inspection
- Calibration
- Specialist cost
Internal Failure Costs:
- Rework / correction
- Re-inspection
- Internal reject
External Failure Costs:
- Complaint handling
- Rework / correction
- Re-inspection
- Loss of business
PHASES OF SIX
SIGMA
Fundamental Steps
There are 5 fundamental Steps
involved in applying the
breakthrough strategy for
achieving Six Sigma. These steps
are :-
Define
Measure
Analyze
Improve
Control
Define Phase
This phase defines the project. It
identifies critical customer
requirements and links them to
business needs. It also defines a
project charter and the business
processes to be undertaken for Six
Sigma.
Define
Define Activities:
- Identify project, Champion and project owner
- Determine customer requirements and CTQs
- Define problem, objective, goals and benefits
- Define stakeholders / resource analysis
- Map the process
- Develop project plan
Define Quality Tools:
- Project charter and plan
- Effort/impact analysis
- Process mapping
- Tree diagram
- VOC
- Kano model
- Pareto analysis
Measurement Phase
This phase involves selecting the
product characteristics, mapping the
respective process, making the
necessary measurements and
recording the results of the
process. This is essentially a data
collection phase.
Measure
Measure Activities:
- Determine operational definitions
- Establish performance standards
- Develop data collection and sampling plan
- Validate the measurements (measurement system analysis)
- Determine process capability and baseline
Measure Quality Tools:
- Measurement systems analysis
- Check sheet
- Process capability
- Process FMEA
Analysis Phase
In this phase an action plan is
created to close the gap
between how things currently work
and how the organization would
like them to work in order to meet
the goals for a particular product
or service. This phase also requires
organizations to estimate their
short term and long term
capabilities.
Analyze
Analyze Activities:
- Benchmark the process or product
- Analysis of the process map
- Brainstorm for likely causes
- Establish causal relationships using data
- Determine root cause(s) using data
Analyze Quality Tools:
- Cause and effect or event diagram
- Graphical analysis
- Statistical analysis of data
- Hypothesis testing
- Correlation / regression
- DOE
Improvement Phase
This phase involves improving
processes/product
performance characteristics for
achieving desired results and
goals. This phase involves
application of scientific tools and
techniques for making tangible
improvements in profitability and
customer satisfaction.
Improve
Improve Activities:
- Develop solution alternatives
- Assess risks and benefits of solution alternatives
- Implement error-proofing solutions
- Validate solution using a pilot
- Implement solution
- Determine solution effectiveness using data
Improve Quality Tools:
- Brainstorming
- FMEA
- Risk assessment
- Poka yoke
Control Phase
This phase requires the process
conditions to be properly
documented and monitored
through statistical process control
methods. After a settling-in
period, the process capability
should be reassessed. Depending
upon the results of such a follow-
up analysis, it may sometimes be
necessary to revisit one or more of
the preceding phases.
Control
Control Activities:
- Determine needed controls (measurement, design, etc.)
- Implement and validate controls
- Develop standards
- Develop transfer plan
- Realize benefits of implementing the solution
- Institutionalize changes
- Close project and communicate results
Control Quality Tools:
- Statistical process control
- Process map and FMEA
- Control plans
- 5S
- Control charts
Six Sigma
Projects
Why is Project Selection Important?
- High-leverage projects lead to the largest savings
- Large returns are expected by management to justify the investment in time and effort
- Developing a Six Sigma culture depends upon successful projects having significant business impact
How To Focus Projects

Process Cost Savings Focus


Project Quality focus
Product focus (Six Sigma Design)
Problem Focus (Least Desirable
Use)
Project Selection
- Align with company objectives and the business plan (Annual Operating Plan)
- Voice of Customer / CT inputs: Quality (CTQ) / Cost (CTC) / Delivery (CTD); PPM / COPQ / RTY / cycle time
- Consistent with the principles of Six Sigma: eliminate process defects
- Concentrate on common issues/opportunities, not fire-fighting
- Large enough to justify the investment
Project Desirability
Effort Required: includes the time required
of team members and the expenditure of money.
Probability of Success: an assessment
that takes into account various risk factors:
- Time: uncertainty of the completion date
- Effort: uncertainty of the investment required
- Implementation: uncertainty of roadblocks
Project Desirability Matrix
[Matrix: project IMPACT (low / medium / high) plotted against EFFORT (low / medium / high); desirability increases with higher impact and lower effort.]
Additional Project Considerations
- Projects must serve as a learning experience for Green Belts to use the Six Sigma tools
- Project scope should not be too large or take too long to implement
- Project scope should be manageable and take at least 25% of the potential Green Belt's time
- A Pareto chart may be used to scope the project
- It is desirable to have a measurable variable for the primary project output/metrics
DO NOT try to solve world hunger.
Strategy At Various Levels
Almost every Organization can be
divided into 3 basic levels:-
1. Business level
2. Operations level
3. Process level.
It is extremely important that Six
Sigma is understood and
integrated at every level.
Strategies At Various Levels
Executives at the business level can use
Six Sigma to improve market share,
increase profitability and ensure the
organization's long-term viability.
Managers at the operations level can use Six
Sigma to improve yield and reduce
labor and material costs.
At the process level, engineers can use
Six Sigma to reduce defects and
variation and improve process capability,
leading to better customer satisfaction.
Factors To Control in
Improvement Project
Resources
Team availability
The right tools
Schedule
Be realistic
Be aggressive
Get buy-in
Scope of Work
Watch for scope creep
Stay focused
Anticipate and mitigate risk

Control any two areas, the third floats in


response
Meetings Make Them
Effective
Defined goal for meeting
Notice and agenda
Decision makers prepare and participate
Action Items
Records
Balance Sheet
Focused on process, not topic
What helped us get to our goal
What could have been better
Take appropriate action
Skills Needed
People
Leadership behaviors
Communication
Process
Time Management
Schedule Coordination
Problem Solving
Risk analysis and mitigation
Tactical Planning
Technical
Six Sigma / Lean Tools
Business Knowledge
Voice of Customer
CTP and CTQ
Establishing Customer Focus
Customer: anyone internal or external to the
organization who comes in contact with the
product or output of work.
Quality: performance to the standard
expected by the customer.
Variation is the enemy in achieving
customer satisfaction:
Variation leads to uncertainty, unknown risk,
disbelief and a higher defect rate.
What is Variation

Variation is any deviation from the


expected outcome.
Something more on Variation
Any process has variation
There are two kinds of variation
Common cause variation
Special cause variation
Variation is measured in terms of
sigma, or standard deviation.
Variation and Standard Deviation
If a good deal of variation exists in a process
activity, that activity will have a very large
standard deviation.
As a result, the distribution will be very wide
and flat.

Less Variation More Variation


Types of Variation
Special Cause: something different happening
at a certain time or place

Common Cause: always


present to some degree in the
process
We tamper with the system if we treat all variation as if it
were special cause
Dealing with Variation
Eliminate special cause
variation by recognizing it and
dealing with it outside of the
process
Reduce common cause variation
by improving the process
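One common operational rule, used on control charts, treats points more than three standard deviations from the mean as signals of possible special-cause variation; everything inside that band is treated as common cause. A rough Python illustration (the 3-sigma rule is standard practice, but the function name and the data are ours):

```python
import statistics

def special_cause_signals(data, k=3.0):
    """Indices of points more than k standard deviations from the mean:
    a simple flag for possible special-cause variation."""
    mean = statistics.fmean(data)
    s = statistics.pstdev(data)
    return [i for i, x in enumerate(data) if abs(x - mean) > k * s]

# A stable process with one unusual reading at the end:
readings = [9.9, 10.1] * 10 + [15.0]
print(special_cause_signals(readings))  # [20] -> only the 15.0 is flagged
```

Points flagged this way deserve investigation outside the process; everything else should be improved by working on the process itself, as the slide says.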
Whom would you Prefer?

Operator - 1 Operator - 2
Critical To Quality (CTQ)
are the key measurable
characteristics of a product or
process whose performance
standards or specification limits
must be met in order to satisfy
the customer.

They align improvement or design


efforts with customer
requirements.
Critical To Quality (CTQ)
1. To put it in layman's terms,
CTQs are what the customer
expects of a product...
2. ...the spoken needs of the
customer.
3. The customer may often express
this in plain English, but it is up
to us to convert them to
measurable terms using tools
such as QFD, DFMEA, etc.
Critical To Quality (CTQ)
1. List customer needs.
2. Identify the major drivers for
these needs (major means those
which will ensure that the need
is addressed).
3. Break each driver into greater
detail.
4. Stop the breakdown of each
level when you have reached
sufficiently detailed, measurable
information.
Example CTQ Tree
Need: ease of operation and maintenance (general, hard to measure)
Drivers: ease of operation; ease of maintenance
CTQs (specific, easy to measure):
- Operator training time (hrs.)
- Setup time (minutes)
- Accuracy (errors/1000 ops)
- Mean Time to Restore (MTTR)
- # special tools required
- Maintenance training time (hrs.)
PROJECT CHARTER
Importance of Project
Charter
A project charter is a written document and
works as an agreement between
management and the team about what is
expected.

The charter:
Clarifies what is expected of the team.
Keeps the team focused.
Keeps the team aligned with
organizational priorities.
Transfers the project from the
champion(s) to the project team.
Team Charter
Problem Statement
Currently we carry out reblows to the extent of
about 11-15% resulting in lower converter life,
lower productivity of converter and increased
Ferro-alloy and oxygen consumption.
Scope
All batches and all converters in SMS 1.
Project Goal and Measures
Reblows should be less than 7.5% and 9%.
Expected Business Results
We hope to save Rs. Xxxxx lakhs per year due
to this reduction in reblows.
Team Charter
Team Members
Supervisor, two operators, technical
services, quality control
Support Required
Allow for weekly team meetings
Team budget for quick wins
Schedule
Measure (7wks), Analyze (4wks),
Improve (6wks), Check (2wks), Control
(1wk), Standardise/Close (1wk)
Usual elements of a Project Charter
- Project description / business case
- Scope (process/product)
- Goals and measures (key indicators)
- Expected business results
- Team members
- Support required
- Expected customer benefits
- Schedule
MEASURE OVERVIEW
Measurement Objective

The Measure phase aims to set a


baseline in terms of process
performance through the
development of
clear and meaningful measurement
systems
The Measurement Process
- Develop process measures: How do you measure the problem?
- Collect process data: When and where does the data come from?
- Check data quality: Does the data represent what you think it does?
- Understand process behavior: How does the process currently behave?
- Baseline process capability and potential: What is the current performance of the process with respect to the customer?

TOOLS AND TECHNIQUES OF MEASURE
- Statistics: operational definitions
- Data collection: methods, data collection plans, sampling
- MSA: Gage R&R
- Distributions: first pass yield, short/long-term variation
- Process capability: Cp, Cpk, DPMO

Statistical worlds:
- Continuous data: the relevant statistical model is the Normal distribution. It does not always apply; the validity of normality needs to be checked. Common statistics are the average (mean) and the standard deviation (sigma).
- Count data: the relevant model is the Poisson distribution, which applies if the process is in control. The common statistic is Defects per Unit (DPU).
- Attribute data: the relevant model is the Binomial distribution, which applies if the process is in control. The common statistic is the percentage (proportion).
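The count-data metrics named here (DPU, plus the DPMO and first-pass-yield figures used elsewhere in this material) are straightforward to compute. A small sketch, with invented inspection numbers; the Poisson zero-defect estimate for yield is a standard Six Sigma convention:

```python
import math

def dpu(defects, units):
    """Defects per unit, the common statistic for count data."""
    return defects / units

def dpmo(defects, units, opportunities_per_unit):
    """Defects per million opportunities."""
    return defects / (units * opportunities_per_unit) * 1_000_000

def first_pass_yield(defects, units):
    """Poisson estimate of the fraction of units produced defect-free."""
    return math.exp(-dpu(defects, units))

# Invented inspection results: 34 defects in 1,000 units, 10 opportunities each
print(dpu(34, 1000))                          # 0.034
print(round(dpmo(34, 1000, 10)))              # 3400
print(round(first_pass_yield(34, 1000), 4))   # 0.9666
```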
Basic Statistics
Statistics
The science of:
Collecting,
Describing
Analyzing
Interpreting data...

And Making Decisions


What are Statistics?
Descriptive Statistics
Summarize and describe a set of data
Mean, median, range, standard deviation,
variance, ....
Analytical (Inferential) Statistics
Techniques that help us make decisions in the
face of uncertainty
Use concepts of descriptive statistics as a
base
Hypothesis testing, means comparisons,
variance comparisons, proportions
comparisons, ...
Sample Versus Population
Using a small amount of data (Sample)...
to make assumptions (inferences)...
on a large amount of data (population).
Population: the total collection of observations
or measurements that are of interest.
Sample: a subset of observations and
measurements taken from the population.
Why do we use samples?
Time
Cost
Destructive testing (need product left to sell !!)
Other?
Measures of Central Tendency
- Median: the middle value of the distribution
- Mode: the most frequently occurring value
- Mean (x̄): the arithmetic average of the entire distribution
What is the best measure of central tendency?
Data Distributions
Mean: the arithmetic average of a set of values.
- Reflects the influence of all values
- Strongly influenced by extreme values
Median: reflects the 50% rank; the center number
after a set of numbers has been sorted from low to high.
- Does not include all values in the calculation
- Is robust to extreme scores
Mode: the value or item occurring most
frequently in a series of observations or
statistical data.
Variable Data Location - Mean
We have data on the monthly demand history of one of our
key product lines. Let's calculate the statistics for location.

Monthly units, Jan-2006 to Dec-2007 (n = 24):
233, 281, 266, 237, 260, 250, 237, 275, 218, 279, 227, 246,
258, 272, 229, 240, 287, 260, 251, 288, 256, 219, 260, 249

Mean: add all of the monthly numbers and divide by the
number of months in the sample. Our average monthly
shipment is about 253 units.
Variable Data Location - Median
Median: sort the data from lowest to highest. If there is an
even number of observations, the median is the average of
the two middle values.

Sorted: 218, 219, 227, 229, 233, 237, 237, 240, 246, 249, 250, 251,
256, 258, 260, 260, 260, 266, 272, 275, 279, 281, 287, 288

With n = 24, the median is the average of the 12th and 13th
values: (251 + 256) / 2 = 253.5 units.
Variable Data Location - Mode
The most frequently occurring value is the mode. In the
sorted demand data, 260 occurs three times, so 260 is the mode.
Notes on mean
A measure of central tendency
Limitations:
Reflects the influence of all values
Strongly influenced by extreme values
Median (the centre number after sorting high to
low) is robust to extreme values.
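The three location statistics for the 24-month demand data can be checked with Python's statistics module:

```python
import statistics

# Monthly shipments, Jan-2006 .. Dec-2007 (from the demand table)
units = [233, 281, 266, 237, 260, 250, 237, 275, 218, 279, 227, 246,
         258, 272, 229, 240, 287, 260, 251, 288, 256, 219, 260, 249]

print(statistics.mean(units))    # 253.25 -> "about 253 units"
print(statistics.median(units))  # 253.5  (average of the two middle sorted values)
print(statistics.mode(units))    # 260    (occurs three times)
```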
Variable Data Description - Range, Standard Deviation
Let's use this same monthly demand data (the 24 values
above) to calculate the statistics for dispersion. These
statistics are the Range and the Standard Deviation.
Example - commuting time
Collect over a hundred occurrences. Tabulate in chronological
order. Does the data show variation? Can you make out anything
with this arrangement of the data? Let us try to make some
sense of it.

Commute time (mins):
19.5 22.4 20.7 18.8 18.2
20.0 19.6 19.8 21.0 19.8
20.7 21.9 22.0 22.6 19.4
22.8 18.1 17.5 21.3 19.1
18.4 19.8 21.0 18.5 19.2
19.2 19.4 19.3 24.8 21.2
21.2 18.3 18.2 17.4 19.9
21.0 18.9 16.4 17.6 19.5
19.2 23.9 20.6 21.9 18.7
19.5 20.1 17.1 22.1 19.2
19.6 20.3 20.8 20.7 22.4
19.9 21.1 20.4 16.7 19.1
18.3 22.4 27.1 17.6 18.8
22.5 19.9 21.8 20.4 17.7
21.3 17.8 18.7 15.8 18.9
21.7 20.1 19.6 18.4 21.7
18.7 18.8 20.5 18.6 20.9
22.0 15.8 19.4 20.2 18.7
23.6 21.0 19.9 20.1 18.3
21.9 19.7 21.1 19.9 22.9
Measure of variation: Standard Deviation and Range
Summary for commute time (Anderson-Darling normality test: A-squared 0.42, p-value 0.312):
- Mean 20.006, StDev 1.884, Variance 3.550, Skewness 0.545, Kurtosis 1.303, N 100
- Minimum 15.754, 1st quartile 18.714, Median 19.819, 3rd quartile 21.186, Maximum 27.054 (one high outlier)
- 95% confidence interval for the mean: 19.632 to 20.380
- 95% confidence interval for the median: 19.448 to 20.263
- 95% confidence interval for the standard deviation: 1.654 to 2.189
The standard deviation is one measure of variation; the range (maximum minus minimum) is another.

What are the relative merits and demerits of the standard deviation over the range?
Variable Data Dispersion
Standard Deviation
s or standard deviation
What does it mean?
Standard deviation is a measure of
dispersion (or how our data is spread
out).
Range will tell us the difference
between the highest and lowest
values in a data set, but nothing
about how the data are distributed.
We need the standard deviation to statistically
describe the distribution of values.
Variable Data Dispersion
Standard Deviation
How we calculate it:
- A measure of how far each point deviates from the mean
- We square each distance so that all the numbers are positive
- The sum of the squares, divided by (n - 1) for a sample, is the variance
- The square root of the variance is the standard deviation
- Variances can be added; standard deviations cannot

Sample: s² = Σ(x - x̄)² / (n - 1)    Population: σ² = Σ(x - μ)² / N
Variable Data Dispersion
Standard Deviation Calculation

Steps:
1. Count the samples: n = 24
2. Calculate the mean: x̄ = 6078 / 24 = 253.25
3. Subtract the mean from each value
4. Square each subtraction result
5. Sum the squares
6. Calculate the denominator (n - 1 = 23)
7. Complete the calculation: divide and take the square root

Month    | # of Units | x - x̄  | (x - x̄)²
Jan-2006 | 233        | -20.25 | 410.0625
Feb-2006 | 281        |  27.75 | 770.0625
Mar-2006 | 266        |  12.75 | 162.5625
Apr-2006 | 237        | -16.25 | 264.0625
May-2006 | 260        |   6.75 |  45.5625
Jun-2006 | 250        |  -3.25 |  10.5625
Jul-2006 | 237        | -16.25 | 264.0625
Aug-2006 | 275        |  21.75 | 473.0625
Sep-2006 | 218        | -35.25 | 1242.5625
Oct-2006 | 279        |  25.75 | 663.0625
Nov-2006 | 227        | -26.25 | 689.0625
Dec-2006 | 246        |  -7.25 |  52.5625
Jan-2007 | 258        |   4.75 |  22.5625
Feb-2007 | 272        |  18.75 | 351.5625
Mar-2007 | 229        | -24.25 | 588.0625
Apr-2007 | 240        | -13.25 | 175.5625
May-2007 | 287        |  33.75 | 1139.0625
Jun-2007 | 260        |   6.75 |  45.5625
Jul-2007 | 251        |  -2.25 |   5.0625
Aug-2007 | 288        |  34.75 | 1207.5625
Sep-2007 | 256        |   2.75 |   7.5625
Oct-2007 | 219        | -34.25 | 1173.0625
Nov-2007 | 260        |   6.75 |  45.5625
Dec-2007 | 249        |  -4.25 |  18.0625

Sum of squares = 9826.5; s² = 9826.5 / 23 ≈ 427.24; s ≈ 20.67
Variable Data Dispersion
Standard Deviation
Standard deviation of a population:
If your data is an entire population, rather than a sample from a population, use the population formula to calculate the standard deviation:

σ = √( Σ(x - μ)² / N )

The difference is the denominator: N versus n - 1.
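The N versus n - 1 distinction is exactly the difference between Python's `statistics.pstdev` (population) and `statistics.stdev` (sample); a small sketch with illustrative data:

```python
import statistics

data = [2, 4, 4, 4, 5, 5, 7, 9]   # illustrative values, mean = 5, sum of squares = 32

# Population standard deviation: divide the sum of squares by N
sigma = statistics.pstdev(data)    # sqrt(32 / 8) = 2.0

# Sample standard deviation: divide by n - 1
s = statistics.stdev(data)         # sqrt(32 / 7), slightly larger

print(sigma, round(s, 3))
```

The sample formula divides by a smaller number, so s is always slightly larger than σ for the same data.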
Fundamental Topic
The Normal Curve
- Processes have natural variation
- Many processes behave normally
- Characterized by a bell-shaped curve
- Mean near the peak
- Curve is symmetric
[Figure: Histogram of Diameter with fitted normal curve; the mean locates the peak and the standard deviation sets the spread; axes are Frequency vs. Diameter]
Measures of Variability
- The Range is the distance between the extreme values of a data set (Highest - Lowest).
- The Variance (s²) is the average squared deviation of each data point from the mean.
- The Standard Deviation (s) is the square root of the variance.
- The range is more sensitive to outliers than the variance.
- The most common and useful measure of variation is the standard deviation.
Sample Statistics versus Population Parameters
- x̄ = Sample Mean; μ = Population Mean
- s = Sample Standard Deviation; σ = Population Standard Deviation
Statistics estimate Parameters.
Statistical Calculation (Sample)
- Mean: x̄ = Σx / n
- Variance: s² = Σ(x - x̄)² / (n - 1)
- Standard Deviation: s = √s²

Statistical Calculation (Population)
- Mean: μ = Σx / N
- Variance: σ² = Σ(x - μ)² / N
- Standard Deviation: σ = √σ²
Normal Distribution
Description of a NORMAL DISTRIBUTION
- LOCATION: the central tendency; it is usually expressed as the AVERAGE.
- SPREAD: the dispersion; it is usually expressed as the standard deviation (sigma).
Properties of Normal Distribution
- The normal distribution is symmetric: it has an equal number of points on both sides.
- Mean, Median and Mode coincide.
- The normal distribution is infinite: the chance of finding a point anywhere on the plus or minus side (around the mean) is never absolutely zero.
Properties of Normal Distribution
Normal Curve & Probability Areas:
- ±1σ around the mean contains about 68% of the area
- ±2σ contains about 95%
- ±3σ contains about 99.73%
[Figure: normal curve marked at -3σ to +3σ with the 68%, 95% and 99.73% bands]
Let's Summarize
- We need data to study, predict and improve processes.
- Data may be Variable or Attribute.
- To understand a data distribution, we need to know its Center, Spread and Shape.
- The Normal Distribution is the most common shape, but not the only one.
Standard Deviation - Graphically

Let us take our demand data and develop a histogram:
1. Set up the scale and limits per subdivision
2. Plot the count of values that fall within each subdivision on the scale

Month    | # of Units      Month    | # of Units
Jan-1999 | 233             Jan-2000 | 258
Feb-1999 | 281             Feb-2000 | 272
Mar-1999 | 266             Mar-2000 | 229
Apr-1999 | 237             Apr-2000 | 240
May-1999 | 260             May-2000 | 287
Jun-1999 | 250             Jun-2000 | 260
Jul-1999 | 237             Jul-2000 | 251
Aug-1999 | 275             Aug-2000 | 288
Sep-1999 | 218             Sep-2000 | 256
Oct-1999 | 279             Oct-2000 | 219
Nov-1999 | 227             Nov-2000 | 260
Dec-1999 | 246             Dec-2000 | 249

[Figure: histogram of Monthly Demand in Units, Frequency (0-5) on the vertical axis]
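The two histogram steps above can be sketched in a few lines; the bin width of 10 units is an assumption for illustration:

```python
from collections import Counter

demand = [233, 281, 266, 237, 260, 250, 237, 275, 218, 279, 227, 246,
          258, 272, 229, 240, 287, 260, 251, 288, 256, 219, 260, 249]

# Step 1: set up the scale - bins of width 10 (210-219, 220-229, ...)
# Step 2: count the values that fall within each subdivision
hist = Counter((value // 10) * 10 for value in demand)

for lower in sorted(hist):
    print(f"{lower}-{lower + 9}: {'#' * hist[lower]}")
```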
Standard Deviation - Graphically
If my data is normal, the fitted curve can be divided into bands of ±1, ±2 and ±3 standard deviations around the mean.
[Figures: histogram of Monthly Demand with fitted normal curve; bands at ±1σ, ±2σ and ±3σ are marked]
Standard Deviation - Simple Application
I have a process with a mean of 43 and a standard deviation of 3.
- 68.3% of the data lies between what points?
- 95.4% of the data lies between what points?
- 99.7% of the area lies between what points?
[Figure: histogram of the process, Frequency on the vertical axis, horizontal axis running from 35 to 51]
Standard Deviation - Simple Application
I have a process with a mean of 43 and a standard deviation of 3.
- 68.3% of the data lies between 40 and 46 (mean ± 1σ)
- 95.4% of the data lies between 37 and 49 (mean ± 2σ)
- 99.7% of the area lies between 34 and 52 (mean ± 3σ)
[Figure: the same histogram with the ±1σ, ±2σ and ±3σ bands marked]
Standard Deviation - Class Exercise
What is the probability that a random sample taken from this process:
- Will have a value between 40 and 46? (68.3%)
- Will have a value between 37 and 49? (95.4%)
- Will have a value between 34 and 52? (99.7%)
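The exercise can be checked numerically with `statistics.NormalDist` (Python 3.8+); the mean and standard deviation are the slide's 43 and 3:

```python
from statistics import NormalDist

process = NormalDist(mu=43, sigma=3)

def p_between(lo, hi):
    # Area under the normal curve between two points
    return process.cdf(hi) - process.cdf(lo)

print(round(p_between(40, 46), 3))  # mean +/- 1 sigma
print(round(p_between(37, 49), 3))  # mean +/- 2 sigma
print(round(p_between(34, 52), 3))  # mean +/- 3 sigma
```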
Probability theory
And Probability
Distribution
Probability
What is the role of Probability in
Statistics?
Any conclusion we reach about a population, based on what we know about a sample, is subject to uncertainty.
This uncertainty is calculated and described using probability theory.
Every output (response) from a process is subject to this uncertainty.
Probability Measure
- Every event (a set of outcomes) is assigned a probability measure.
- The probability of every event is between 0 and 1, inclusive.
- The probability of the whole set of outcomes is 1.
- If A and B are two events with no common outcomes, then the probability of their union is the sum of their probabilities.
Probability Measure
Probability of an event A = P(A)
P(A) = (number of favorable outcomes) / (total number of possible outcomes)
Cards - events: a red card (1/2); a jack (1/13)
The chance of calling correctly on the toss of a coin is 1/2, i.e., 0.5.
Probability
Building an Understanding
We'll start with a pair of dice.
Our customer will only accept combinations that equal 3, 4, 5, 6, 7, 8, 9, 10 and 11.
What is the probability of meeting this requirement?
Probability
Building an Understanding
The customer defines a response of 2 or 12 as a defect.
Calculate all possible responses from the combinations of inputs:

            Die 1 Roll
             1   2   3   4   5   6
Die 2   1    2   3   4   5   6   7
Roll    2    3   4   5   6   7   8
        3    4   5   6   7   8   9
        4    5   6   7   8   9  10
        5    6   7   8   9  10  11
        6    7   8   9  10  11  12

- How many total combinations exist?
- How many times is my response a 2? What is the probability of a response of 2?
- How many times is my response a 12? What is the probability of a response of 12?
- What is the probability of a defect (2 or 12)?
Probability
Building an Understanding
Another example: what is the probability of rolling a 7 using a fair pair of dice?

Die 1 | Die 2 | Probability
  1   |   6   |   0.0278
  2   |   5   |   0.0278
  3   |   4   |   0.0278
  4   |   3   |   0.0278
  5   |   2   |   0.0278
  6   |   1   |   0.0278
            Total: 0.1667

The probability of each roll (1/36 ≈ 0.0278) is included in each block of the table; six combinations give a total of 6/36 ≈ 0.1667, i.e., about a 16.7% probability.
Probability
- Probability of any given value on Die 1: 1/6
- Probability of any given value on Die 2: 1/6
- Probability of any given combination: 1/36

Value (Response) | Frequency | Probability
       2         |     1     |   0.0278
       3         |     2     |   0.0556
       4         |     3     |   0.0833
       5         |     4     |   0.1111
       6         |     5     |   0.1389
       7         |     6     |   0.1667
       8         |     5     |   0.1389
       9         |     4     |   0.1111
      10         |     3     |   0.0833
      11         |     2     |   0.0556
      12         |     1     |   0.0278
     Total       |    36     |   1.0000
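The frequency table above can be reproduced by enumerating all 36 equally likely rolls; a short sketch:

```python
from collections import Counter
from fractions import Fraction

# Enumerate every combination of two fair dice and count each total
totals = Counter(d1 + d2 for d1 in range(1, 7) for d2 in range(1, 7))

# Each combination has probability 1/36
prob = {value: Fraction(count, 36) for value, count in totals.items()}

print(totals[7], float(prob[7]))   # 6 ways to roll a 7 -> 6/36
print(float(prob[2] + prob[12]))   # the customer's defects: 2 and 12
```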
Probability
[Figure: bar chart of the probability of each response (dice total) from 2 to 12; the bars rise from 0.0278 at 2, through 0.0556, 0.0833, 0.1111 and 0.1389, peak at 0.1667 for 7, then fall symmetrically. This represents the response of our system, in probability.]
Probability
Our customer will only accept combinations that equal 3, 4, 5, 6, 7, 8, 9, 10 and 11.
We have a 94.44% probability of meeting the customer's specification (34 of the 36 equally likely combinations; only 2 and 12 are defects).
The curve of this distribution becomes its Probability Density Function.
[Figure: bar chart of the response distribution with LSL and USL marked just inside responses 2 and 12; the accompanying table repeats the Value/Frequency/Probability figures above]
Probability Theory
What is a Probability Density Function?
- A mathematical function
- It models the probability density reflected in a histogram
- With more observations, class intervals become narrower and more numerous
- The histogram of the variable takes on the appearance of a smooth curve
- The total area under the curve must equal 1
The probability that a random variable will assume a value between any two points is equal to the area under the random variable's probability density function between these two points.
What does this mean to us?
Probability Theory
This histogram has 24 points distributed over 12 intervals.
[Figure: histogram of Frequency (0-4.5) vs. Response Intervals 1-12]
Probability Theory
As the number of data points increases, the intervals get smaller.
When we do this, the curve outlining the data gets smoother.
[Figure: finer-grained histogram of Frequency (up to 400) vs. Response Intervals]
Probability Theory
What do we know about a Probability Distribution?
- The area under the curve always equals 1.
- We can determine the probability that a value of a random variable will fall between 2 points on the curve by calculating the area under the curve between the two points.
Why would we want to do this? How do we do this?
Using Probability Distribution
The Standard Normal Distribution
Let's take a look at the most important PD: the standard normal distribution.
We can transform each point on our normal curve into a standard normal curve value using the Z transform.
Using Probability Distribution
The Standard Normal Distribution
Standard Normal Curve Characteristics (after the Z transform of the original distribution):
- It has a mean of 0.0
- It has a standard deviation of 1.0
- The area under the curve equals 1
- The curve is symmetrical
Using Probability Distribution
The Standard Normal Distribution - The How
1. Find the points on the Standard Normal Distribution that correspond to your values: Z = (x - μ) / σ
2. Determine the area under the standard normal curve that is between the points you have found.
If our data is normal, we can use the Standard Normal Distribution.
This saves us from having to do the calculation for each specific situation!
Using Probability Distribution
The Standard Normal Distribution - A Why Example:
The unit sales of Product A follow a normal distribution with a monthly average of 253 units and a standard deviation of 21 units (μ = 253, s = 21).
What is the probability that next month's sales will be greater than 300 units?
Using Probability Distribution
The Standard Normal Distribution
What is the probability that next month's unit sales will be greater than 300?
1. Find the point on the Standard Normal Distribution that corresponds to 300 (with μ = 253, s = 21):
Z = (300 - 253) / 21 = 2.24
This tells us that 300 is 2.24 standard deviations from the mean.
Using Probability Distribution
The Standard Normal Distribution
2. Determine the area under the standard normal curve that is to the right of Z = 2.24.
How? Use the Table of the Standard Normal Distribution.
Using Probability Distribution
The Standard Normal Distribution
This table shows the area between 0 (the mean of the standard normal distribution) and Z.
Because the curve is symmetric, the area on each side of the mean is 0.500.
The area to the right of a positive value is 0.500 minus the area between 0 and the Z value.
For Z = 2.24 (the equivalent of 300):
- Locate the row labeled 2.2 and the column labeled .04
- The tabled area is 0.4875
- Subtract this area from 0.500: 0.500 - 0.4875 = 0.0125
I have a 1.25% probability that my unit sales next month will be greater than 300 units.
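Instead of interpolating a printed Z table, the same tail area can be computed directly; a sketch using the slide's figures (μ = 253, s = 21):

```python
from statistics import NormalDist

sales = NormalDist(mu=253, sigma=21)

z = (300 - sales.mean) / sales.stdev   # (300 - 253) / 21 ≈ 2.24
p_over_300 = 1 - sales.cdf(300)        # area to the right of 300

print(round(z, 2), round(p_over_300, 4))
```

The exact answer (about 0.0126) differs slightly from the table's 0.0125 only because the table rounds Z to 2.24.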
Normal Distribution
If you know your average value (x̄) and your standard deviation (s), then for a given specification limit it is possible to predict the rejections (if any) that will occur, even if you keep your process in control.
Example: x̄ = 2.85, s = 0.02 (the dimensions relate to a punched part).
Let us find the percentage rejection if the specified value is 2.85 ± 0.04, i.e., the part is acceptable between 2.81 and 2.89.
Normal Distribution
Applicable in real life:
[Figure: normal curve centered at 2.85 with limits 2.81 and 2.89; the tail below 2.81 and the tail above 2.89 are rejections, the region between them is the acceptable range]
Normal Distribution
Let A, B and C represent the areas under the curve for the following conditions:
- A: rejections for undersize (below 2.81)
- B: acceptable range (2.81 to 2.89)
- C: rejections for oversize (above 2.89)
Total Area = A + B + C
Total Rejections = A + C
Normal Distribution
We will introduce a concept called Z, which we can use with a one-sided distribution to determine the areas A, B and C, and thus the percentage of rejections and acceptable components.
Normal Distribution
For the oversize limit, Z = (2.89 - 2.85) / 0.02 = 2.
The area from the Normal table corresponding to Z = 2 is 0.02275.
Hence rejection for oversize (Area C) = 2.275%.
Similarly, one can find the rejection for undersize.
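The punched-part rejection estimate can be verified the same way; both tails are computed from the specification limits 2.81 and 2.89:

```python
from statistics import NormalDist

part = NormalDist(mu=2.85, sigma=0.02)

oversize = 1 - part.cdf(2.89)          # Area C: Z = (2.89 - 2.85) / 0.02 = 2
undersize = part.cdf(2.81)             # Area A: symmetric, also Z = 2
acceptable = 1 - oversize - undersize  # Area B

print(round(oversize * 100, 3), round((oversize + undersize) * 100, 2))
```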
Discrete Probability Distributions
Binomial Distribution
When applicable: when the variable is attribute data with binary alternatives, such as good or bad, defective or non-defective, success or failure, etc.
Conditions:
- The experiment consists of n identical trials.
- There are only two possible outcomes on each trial, denoted Success (S) and Failure (F).
- The probability of S remains the same from trial to trial and is denoted by p; the probability of F is q, where p + q = 1.
- The trials are independent.
Binomial Distribution
For a random experiment of sample size n where there are two categories of events, the probability of x successes in one category (with n - x in the other category) is

P(x) = C(n, x) · p^x · q^(n-x),  x = 0, 1, ..., n

where q = 1 - p is the probability that the event will not occur, and C(n, x) = n! / (x!(n-x)!).
Binomial Distribution
Consider now the probability of having the number 2 appear exactly three times in seven rolls of a six-sided die:

P(3) = C(7, 3) · (1/6)³ · (5/6)⁴ ≈ 0.078
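The binomial term can be evaluated with `math.comb`; the example is the slide's seven rolls of a die:

```python
from math import comb

def binomial_pmf(n, x, p):
    # P(x) = C(n, x) * p^x * q^(n - x), with q = 1 - p
    return comb(n, x) * p ** x * (1 - p) ** (n - x)

# Probability that the number 2 appears exactly three times in seven rolls
p = binomial_pmf(n=7, x=3, p=1 / 6)
print(round(p, 4))
```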
Poisson Distribution
When applicable:
- No. of accidents in a specified period of time
- No. of errors per 100 invoices
- No. of telephone calls in a specified period of time
- No. of surface defects in a casting
- No. of insulation faults in a specified length of cable
- No. of visual defects in a bolt of cloth
- No. of spare parts required over a specified period of time
- No. of absences in a specified period of time
- The number of death claims in a hospital per day
- The number of breakdowns of a computer per month
- The PPM of a toxicant found in water or air emissions from a manufacturing plant
Poisson Distribution
Two Properties of a Poisson Experiment:
1. The probability of an occurrence is the same for any two intervals of equal length.
2. The occurrence or nonoccurrence in any interval is independent of the occurrence or nonoccurrence in any other interval.
Poisson Distribution
Conditions:
- The experiment consists of counting the number of times a particular event occurs during a given unit of time, or in a given area, volume, weight, distance, etc.
- The probability that an event occurs in a given unit of time is the same for all the units.
- The number of events that occur in one unit of time is independent of the number that occur in other units.
- The mean number of events in each unit is denoted by λ.
Poisson Distribution
The Poisson random variable X is the number of events that occur in a specified period of time:

P(x) = (λ^x · e^(-λ)) / x!

A company observed that over several years they had a mean manufacturing line shutdown rate of 0.10 per day. Assuming a Poisson distribution, determine the probability of two shutdowns occurring on the same day.
For the Poisson distribution, λ = 0.10 occurrences/day and x = 2 results in the probability
P(2) = (0.1² · e^(-0.1)) / 2! ≈ 0.0045
Poisson Distribution
Suppose the number of breakdowns of machines in a day follows a Poisson distribution with an average of 3 breakdowns per day.
Find the probability that there will be no breakdowns tomorrow.
With λ = 3: P(0) = (3⁰ · e^(-3)) / 0! = e^(-3) ≈ 0.0498
Poisson Distribution
Example: Mercy Hospital
Patients arrive at the emergency room of Mercy Hospital at the average rate of 6 per hour on weekend evenings.
What is the probability of 4 arrivals in 30 minutes on a weekend evening? (Note that λ = 3 per 30-minute interval.)
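All three Poisson examples use the same probability mass function; a sketch covering each of them:

```python
from math import exp, factorial

def poisson_pmf(x, lam):
    # P(x) = (lam^x * e^-lam) / x!
    return lam ** x * exp(-lam) / factorial(x)

print(round(poisson_pmf(2, 0.10), 4))  # two shutdowns in a day, lam = 0.10
print(round(poisson_pmf(0, 3), 4))     # no breakdowns tomorrow, lam = 3
print(round(poisson_pmf(4, 3), 4))     # 4 arrivals in 30 minutes, lam = 6/2 = 3
```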
Control Charts
Process Accuracy and Precision
We have curves that describe our process. Some questions we may ask:
- Is my process accurate?
- Is my process precise?

Accuracy describes centering: is my process mean at my target mean?
[Figure: distribution against LSL, Target and USL with an off-center mean]

Precision describes spread: how does the spread of my process compare to the customer's specification limits?
[Figure: distribution against LSL, Target and USL showing spread relative to the limits]

The four combinations: Inaccurate and Imprecise; Accurate and Imprecise; Precise but Inaccurate; Accurate and Precise.
Capability
In Statistical Terms
[Figure: four distributions against LSL and USL. A large standard deviation with the mean not centered in the specification gives the worst capability; a small standard deviation with the mean centered in the specification gives the best. Better capability comes from reducing the standard deviation and centering the mean.]
SPC
PROCESS
The combination of people, equipment, materials, methods, measurement and environment that produces an output: a given product or service.
A process is the transformation of given inputs into outputs.
SPC
VARIATION
The inevitable differences among individual outputs of a process.
The sources of variation can be grouped into two major classes: Common Causes and Special Causes.
SPC
COMMON CAUSE
A source of variation that affects
all the individual values of the
process output being studied

This is the source of the inherent


process variation.
SPC
Common Causes:
1. Plenty in numbers
2. Each results in only a small variation
3. Part of the process
4. Result in a constant level of variation
5. Predictable
6. Controllable by management
7. Statistical methods apply
SPC
Examples of Common Causes:
- MAN: Differences in competency (setting, operating & inspection) of employees working in shifts.
- MACHINE: Differences in product quality when production of the same part is carried out as per plan; UPS provided for the electricity supply.
- MATERIAL: Differences in mechanical & chemical properties between 2 different lots of material of the same grade received from suppliers (raw material manufacturers).
SPC
SPECIAL CAUSE:
A source of variation that affects only some of the output of the process; it is often intermittent and unpredictable. A special cause is sometimes called an assignable cause. It is signaled by one or more points beyond the control limits, or by a non-random pattern of points within the control limits.
SPC
Special Causes:
1. Few in numbers
2. Result in large variation
3. Visitors to the process
4. Variation due to external factors
5. Fluctuating variation
6. Unpredictable
7. Controllable by operating personnel
8. Statistical methods do not apply
Recognize and deal with special causes outside the (Six Sigma) process.
Implement Corrective and Preventive Action (CAPA).
SPC
Examples of Special Causes:
- MAN: Untrained employee working on the machine.
- MACHINE: Production on a conventional lathe where the product run-out requirement is 2 microns; major & frequent breakdowns of the machine; frequent power failures.
- MATERIAL: Use of a different grade of raw material.
- METHOD: Setting of process parameters which are not proven; tool breakage.
- MEASUREMENT: Use of a micrometer having a range of 0-25 mm to check an O.D. of 25 ± 0.1 mm.
Types of Control Charts

VARIABLE: X̄-R, X̄-s, X-mR, CUSUM
ATTRIBUTE: p, np, c, u
Control Charts
Overview
The first step in control charting is to identify the CTQs of the process that are required to be brought under control.
Types of Control Charts: the choice depends on the nature of the variable to be controlled:
- Variable Control Charts
- Attribute Control Charts
CONTROL CHARTS
Variable Control Chart: X̄-R
When to use: when studying the behavior of a single measurable characteristic produced in relatively high volumes.
How: by plotting sample averages (X̄) and ranges (R) on separate charts. This allows independent monitoring of the process average and the variation about that average.
Conditions:
- Constant sample size.
- One characteristic per chart.
- Should have no fewer than 20 samples before calculating control limits.
Variable Control Chart: X̄-R
1. The most common type of control chart for analyzing continuous variables.
2. The X̄ part of the chart tracks the variation between the averages of consecutive subgroups of data points.
3. The R part of the chart tracks the changes in variation within each subgroup.
Variable Control Chart
RATIONAL SUBGROUP CONCEPT
Subgroups or samples should be selected so that if assignable causes are present, the chance for differences between subgroups will be maximized, while the chance for differences due to these assignable causes within a subgroup will be minimized.
Time order is frequently a good basis for forming subgroups because it allows us to detect assignable causes that occur over time.
Two general approaches for constructing rational subgroups:
- Consecutive units of production
- A random sample of all process output over the sampling interval
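Once rational subgroups are formed, the X̄-R control limits follow from the subgroup averages and ranges. A minimal sketch; the subgroup data are invented for illustration, and A2 = 0.577, D3 = 0, D4 = 2.114 are the standard control chart constants for subgroups of size n = 5:

```python
from statistics import mean

# Illustrative subgroup averages and ranges (subgroup size n = 5)
xbars = [20.1, 19.8, 20.3, 19.9, 20.0]
ranges = [2.0, 1.5, 2.5, 1.8, 2.2]

A2, D3, D4 = 0.577, 0.0, 2.114   # standard constants for n = 5

xbar_bar = mean(xbars)   # grand average -> center line of the X-bar chart
r_bar = mean(ranges)     # average range -> center line of the R chart

ucl_x = xbar_bar + A2 * r_bar   # X-bar chart limits
lcl_x = xbar_bar - A2 * r_bar
ucl_r = D4 * r_bar              # R chart limits
lcl_r = D3 * r_bar

print(ucl_x, lcl_x, ucl_r, lcl_r)
```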
Control Chart
Reviewing plots & analysis of trends:
- Ensure that all points on both the X̄ and R charts are within control limits.
- If any point touches either control limit, review the process-related remark corresponding to that particular subgroup. This is an assignable cause.
- Study particular trends, if any.
Case study: consider the process of side-member sub-assembly where the critical dimensional characteristic, i.e., concentricity of mounting holes, is controlled.
Control Chart
TRENDS ANALYSIS IN SPC CHARTS - ALL POINTS WITHIN CONTROL LIMITS

S.No | Trend Type | Meaning | Precautions for better process control
1. All points within control limits with a zigzag pattern | Process under control; variation due to random causes; the zigzag pattern changes with each point | Let the process continue; try to make it a natural process
2. 7 or more consecutive points on one side of the center line | Process center shifted towards one of the specification limits | Make changes to bring the process back to center
3. Cyclic trends | Assignable cause happening periodically | Study the assignable cause and its reason; study how to prevent it
4. Continuous inclination towards one of the control limits | Assignable cause for process drift; if not prevented, the product may go out of specification | Study the assignable cause; set the process to prevent drifting
Control Chart
TRENDS ANALYSIS IN SPC CHARTS - POINTS GOING OUT OF CONTROL LIMITS

S.No | Trend Type | Meaning | Precautions for better process control
1. Points suddenly going out of control limits | Assignable cause present | Study the specific process event associated with the period of the specific point; study probable causes for the assignable cause and try to resolve them
2. Any point going out of control limits with a definite trend | Process going out of control due to an assignable cause | Study the trend type & establish controls to prevent the assignable cause occurring
Typical Out-Of-Control Patterns
- Point outside control limits
- Sudden shift in process average
- Cycles
- Trends
- Hugging the center line
- Hugging the control limits
- Instability
[Figures: example charts illustrating a shift in process average, cycles, and a trend]
Control Charts
PURPOSE OF CONDUCTING SPC STUDIES:
- To study and analyze process variation
- To find out trends in processes
- To identify random & sporadic causes
- To manufacture products of consistent quality
- To prevent wastage of material
Process Capability For
Continuous Data
Capability vs Stability
Capability has a meaning only when a process is
stable.
If a process is out of control, first we need to
stabilize the process.
Improvement in the inherent variation can be
made only when the process is stable.
Control Charts are used to study stability.
The first job of Six Sigma practitioner is to
identify and remove Special Causes of Variation.
Once the process is made predictable, the next
job is to identify the causes of inherent variation
and remove them.
Calculating Capability

Cp = (USL - LSL) / (6σ)

[Figures: normal curves between LSL and USL illustrating capability; Six Sigma capability corresponds to a curve so narrow that ±6σ fits within the specification limits]

Calculating Capability - calculate from the upper and lower side:

Cpu = (USL - x̄) / (3σ),  Cpl = (x̄ - LSL) / (3σ),  Cpk = min(Cpu, Cpl)

[Figure: distribution on a 0-20 scale with both one-sided calculations marked]
Calculating Performance - calculate from the upper and lower side:

Ppu = (USL - x̄) / (3σ_LT),  Ppl = (x̄ - LSL) / (3σ_LT),  Ppk = min(Ppu, Ppl)

If the formulae are the same, what is the difference? The difference is in the sigma calculation:
- Sigma in Capability covers short-term variation.
- Sigma in Performance covers long-term variation.
How is the data collection different?
Process Capability Ratios
Understanding Cp and Cpk
[Figure: two distributions between LSL and USL contrasting "Process Capability" (Cp) with "Real Capability" (Cpk); as the mean drifts off center, the number of defects increases even though Cp is unchanged, so continuous improvement must track Cpk]
- Cp only works for a process that is centered on the target.
- Cpk is a better measure for tracking performance.
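Both indices can be sketched directly; the first call reuses the earlier punched-part figures (x̄ = 2.85, σ = 0.02, specs 2.81-2.89), and the second shifts the mean to show why Cpk is the better tracking measure:

```python
def capability(mean, sigma, lsl, usl):
    # Cp: how the specification width compares to the 6-sigma process spread
    cp = (usl - lsl) / (6 * sigma)
    # Cpk: the same comparison, penalized for an off-center mean
    cpk = min((usl - mean) / (3 * sigma), (mean - lsl) / (3 * sigma))
    return cp, cpk

cp, cpk = capability(mean=2.85, sigma=0.02, lsl=2.81, usl=2.89)
print(round(cp, 3), round(cpk, 3))   # centered process: Cp equals Cpk

cp2, cpk2 = capability(mean=2.87, sigma=0.02, lsl=2.81, usl=2.89)
print(round(cp2, 3), round(cpk2, 3)) # off-center mean lowers Cpk, not Cp
```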
Capability Indices
Example
[Figures: histograms against LSL and USL on measurement scales from 0.300 to 0.320, illustrating the capability index calculations]
Let's Summarize
- A process cannot be improved until it is stabilized.
- Capability data should be utilized only for stable processes.
- Subgroups should contain consecutive data, not random data.
- Performance calculations should be based on a large amount of data representing long-term variation.
Process Capability For
Attribute Data
Discrete Data Capability
- A discrete defect is an attribute, which can be counted, such as scratches, spots, dent marks, cracks, etc.
- In these cases Cp/Cpk does not make sense.
- A defect is a non-conformance to the standards.
- A defective unit can have more than one defect.
- A sample of 100 may have 2 defectives but 5 defects.
Discrete Data Capability
Defect Opportunities:
- Defect opportunities are the various types of defects that may occur.
- These create dissatisfaction for the customers.
- This is different from the defects that actually occur.
- Example: 12 types of defects can occur on a painted part. However, on a part produced, we may observe anywhere from 0 up to 12 defects. Thus a part may be defect-free or may have 1 to 12 defects.
Discrete Data Capability
Example: A sample of 100 units has been taken. Following are the results of inspection:
- No. of defectives: 3
- No. of defects: 10
- No. of opportunities: 12
Discrete Data Capability
Example: The capability can be calculated as follows:
- No. of units = U = 100
- Defects = D = 10
- No. of opportunities = O = 12
- Total defect opportunities = U × O = 100 × 12 = 1200
- DPO = defects per opportunity = 10 / 1200 = 1/120 ≈ 0.0083
Discrete Data Capability
Example:
Defect per million opportunities
(DPMO)
=DPO x 1,000,000
=0.0083 x 1,000,000
=8300 DPMO
From the tables, the corresponding
sigma level is 3.9.
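The DPMO arithmetic, including the conversion to a sigma level (using the conventional 1.5σ shift), can be sketched as:

```python
from statistics import NormalDist

units, defects, opportunities = 100, 10, 12

dpo = defects / (units * opportunities)   # 10 / 1200 ≈ 0.0083
dpmo = dpo * 1_000_000                    # ≈ 8333 DPMO (the slide rounds to 8300)

# Sigma level = Z for the defect-free proportion, plus the 1.5 sigma shift
sigma_level = NormalDist().inv_cdf(1 - dpo) + 1.5

print(round(dpmo), round(sigma_level, 1))
```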
Discrete Data Capability
The same formula can also be expressed as:

DPMO = (D × 1,000,000) / (U × O)
Discrete Data Capability
Example of DPMO:
Suppose we observe 200 letters delivered incorrectly to the wrong addresses in a small city during a single day when a total of 200,000 letters were delivered. What is the DPMO in this situation?

DPMO = (200 / 200,000) × 1,000,000 = 1,000

So, for every one million letters delivered, this city's postal managers can expect to have 1,000 letters incorrectly sent to the wrong address.
What is the Six Sigma level for this process?
DPMO Example
IRS tax form advice: a survey of responses indicates a predicted error rate.
If 40%, then:
- DPO = 0.40
- DPMO = 0.40 defects/opportunity × 1,000,000 opportunities/million = 400,000
- 400,000 DPMO = 1.75 Sigma
DPMO Example
Example of Rolled Throughput Yield
If there are five processes with the following yields:

Process No. | Yield in %
     1      |    90
     2      |    99
     3      |    95
     4      |    96
     5      |   100

Rolled throughput yield for this process = 0.90 × 0.99 × 0.95 × 0.96 × 1.00 = 81.26%
DPMO - Exercise
- You have 100 documents.
- You take a sample of 10 documents.
- There are 10 opportunities for defect on each document.
- 5 defects were found.
- What is the DPMO?
[Figure: the inspected sample document - an "Attendance Policy" memo dated June 23, 2000, from the Crane Operational Excellence Program, circulated to all Operational Excellence Leaders]
Complexity and Capability
Rolled Throughput Yield Example
Payroll and Labor Tracking Process

The six steps (yields 97.4%, 94.6%, 98.0%, 91.8%, 95.5% and 99.9%) include reading and recording start and stop times, totaling daily work hours and job accounts, submitting the time card, transferring hour totals to the daily payroll generation system, totaling weekly payroll hours, and creating payroll checks.

Output = 97.4% × 94.6% × 98.0% × 91.8% × 95.5% × 99.9% = 79.1%

Does complexity have an important impact on process capability and quality? There are many opportunities for defects.
Complexity and Capability
Rolled Throughput Yield Example
Payroll and Labor Tracking Process: the six-step process (97.4% × 94.6% × 98.0% × 91.8% × 95.5% × 99.9%) yields an output of 79.1%.
Our goal: reduce the total number of opportunities and increase the capability of the remaining opportunities.

- Step 1 (99.6%): Scan employee badge for start and stop time
- Step 2 (99.4%): Scan employee badge and job card for labor start and stop time
- Step 3 (99.9%): Print payroll checks generated from the computer database

Output = 99.6% × 99.4% × 99.9% = 98.9%
Complexity and Capability
Rolled Throughput Yield Example

A Three Sigma Process:
Step 1 × Step 2 × Step 3 = 93.32% × 93.32% × 93.32% = 81.26% output

A Six Sigma Process:
Step 1 × Step 2 × Step 3 = 99.999997% × 99.999997% × 99.999997% ≈ 99.99999% output

Notice any difference?
Sigma Levels

SIGMA | Defects per Million Opportunities (DPMO)
  1   | 690,000
  2   | 308,537
  3   |  66,807
  4   |   6,210
  5   |     233
  6   |     3.4
Introduction To
Hypothesis Testing
Hypothesis Testing Concept
- Hypothesis testing is one of the most scientific ways of decision making.
- It works very much like a court case: we have a suspect, and we have to decide whether he/she is innocent or guilty.
- Suppose a person is charged with murder, and both sides (defense and prosecution) have no evidence. What would the decision be?
- Which is the null hypothesis: "Innocent unless proven guilty" or "Guilty unless proven innocent"?
Null Hypothesis
- The null hypothesis is represented by Ho.
- It is a statement of innocence.
- It is something that has to be assumed if you cannot prove otherwise.
- It is a statement of No Change or No Difference.
Null Hypothesis - A Court Case
- Just like a court case, we first assume the accused (X) is innocent and then try to prove otherwise based on evidence (data).
- If the evidence (data) does not show a sufficient difference, we cannot reject the innocence (Ho).
- But if the evidence (data) is strong enough, we reject the innocence (Ho) and pronounce the suspect guilty (Ha).
- The statement that will be considered valid if the null hypothesis is rejected is called the Alternate Hypothesis (Ha).
Null Hypothesis - A Concept
Hypothesis testing is a philosophy that reflects real-life situations:
- You cannot prove two things equal.
- You cannot prove two things different by proving only one difference.
- If you cannot prove two things different, you have to assume that they are equal.
- But if you cannot prove them different, are they really equal? What is the RISK involved?
Hypothesis Testing Concept

In truth, the defendant is:
                    | Innocent                  | Guilty
Verdict: Innocent   | Correct decision:         | Incorrect decision:
                    | innocent individual       | guilty individual
                    | goes free                 | goes free
Verdict: Guilty     | Incorrect decision:       | Correct decision:
                    | innocent individual is    | guilty individual is
                    | disciplined               | disciplined
Hypothesis Testing Concept

True, but unknown, state of the world:
                      | Ho is True              | Ha is True
Decision: Ho is True  | Correct decision        | Incorrect decision:
                      |                         | Type II Error, probability = β
Decision: Ha is True  | Incorrect decision:     | Correct decision
                      | Type I Error,           |
                      | probability = α         |
Hypothesis Testing Concept
Hypothesis testing as a justice system:
1. State the opposing conjectures, Ho and Ha.
2. Determine the amount of evidence required, n, and the risk of committing a Type I error, α.
3. What sort of evaluation of the evidence is required, and what is the justification for this? (type of test)
4. What are the conditions which proclaim guilt and those which proclaim innocence? (decision rule)
5. Gather & evaluate the evidence.
6. What is the verdict? (Ho or Ha?)
7. Determine the zone of belief: confidence.
Hypothesis Testing
1. Null Hypothesis (Ho) - a statement of no change or difference. The statement is assumed true until sufficient evidence is presented to reject it.
2. Alternate Hypothesis (Ha) - a statement of change or difference. This statement is considered true if Ho is rejected.
3. Type I Error - the error of rejecting Ho when it is in fact true, i.e., saying there is a difference when there really is none.
4. Alpha Risk - the maximum risk or probability of making a Type I Error. This probability is always greater than zero, and is usually established at 5%. The researcher decides on the greatest level of risk that is acceptable for a rejection of Ho. Also known as the significance level.
5. Type II Error - the error of failing to reject Ho when it is in fact false, i.e., saying there is no difference when there really is a difference.
Hypothesis Testing
Concept
6) Beta Risk (β) - the risk or probability of making a Type II error: overlooking an effective treatment or solution to the problem.
7) Significant Difference - the term used when a difference is too large to be reasonably attributed to chance.
Risks
α risk is also called producer's risk.
β risk is also called consumer's risk.
Can we commit both Type I and Type II errors at the same time?
Is it necessary that we will have both α and β risks?
Are α and β risks equal?
Is α + β = 1?
Is there any relationship between α and β?
Which risk is more important?
Risks
An α risk of 5% is generally accepted.
A β risk of 10% is generally accepted.
Since Ha cannot be proved directly, our attempt is to try and reject Ho.
What risk do we take in trying to reject Ho? The α risk.
Minitab represents the α risk by the p-value!
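The meaning of the α risk can be seen by simulation: when Ho really is true (both samples come from the same population) and we test at α = 0.05, about 5% of experiments will still reject Ho. This is a sketch in Python with numpy/scipy (the course itself uses Minitab); the sample sizes, population parameters, and seed are arbitrary choices for illustration.

```python
# Simulate the alpha risk: both samples come from the SAME population,
# so Ho ("no difference") is true, yet a test at alpha = 0.05 will
# falsely reject Ho in roughly 5% of the experiments.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
alpha = 0.05
trials = 2000

false_rejections = 0
for _ in range(trials):
    a = rng.normal(loc=100, scale=10, size=30)
    b = rng.normal(loc=100, scale=10, size=30)  # same population: Ho is true
    _, p_value = stats.ttest_ind(a, b)
    if p_value < alpha:
        false_rejections += 1

type_i_rate = false_rejections / trials
print(f"Observed Type I error rate: {type_i_rate:.3f}")
```

With 2000 simulated experiments the observed rate lands close to the chosen α of 0.05, which is exactly what "5% producer's risk" promises.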
Steps in Hypothesis Testing
Define Ho.
Define Ha.
Select the appropriate test.
Decide the significance level (α and β).
Decide the sample size.
Collect the data.
Conduct the test.
Interpret!
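The steps above can be sketched as a 1-sample t-test in Python with scipy (the course uses Minitab); the target value and the ten measurements below are made up purely for illustration.

```python
# Walking the hypothesis-testing steps with a 1-sample t-test.
import numpy as np
from scipy import stats

# Steps 1-2. Ho: process mean = 50 (target); Ha: process mean != 50
target = 50.0
# Step 3. Appropriate test: 1-sample t-test (one sample mean vs. a target)
# Step 4. Significance level
alpha = 0.05
# Steps 5-6. Sample size and data (n = 10 illustrative measurements)
data = np.array([50.2, 49.8, 51.1, 50.5, 49.9, 50.7, 51.3, 50.1, 50.4, 50.9])
# Step 7. Conduct the test
t_stat, p_value = stats.ttest_1samp(data, popmean=target)
# Step 8. Interpret: reject Ho if p < alpha
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
print("Reject Ho" if p_value < alpha else "Fail to reject Ho")
```

Here the sample mean (50.49) sits far enough above the target that p falls below α, so Ho is rejected.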
Define Ho/Ha for the following cases:
To find if a distribution is normal or not:
Ho =>
Ha =>
To find if the defects from three machines are the same or different:
Ho =>
Ha =>
To find if two groups of students from different streams have differing IQ:
Ho =>
Ha =>
Basic Concepts
Statistical Error
Statistical Error Definitions
Null Hypothesis (Ho): the status quo - nothing is different; a statement of equality. We fail to reject Ho based on statistical evidence.
Alternate Hypothesis (Ha): something is different; a statement about the population that requires strong evidence to prove. If we reject Ho, we in practice accept Ha.
Alpha Risk (α): also called a Type I error - rejecting the null hypothesis when it is in fact true.
Beta Risk (β): also called a Type II error - accepting the null hypothesis when it is in fact false.
Statistical Error
Typical Risks
Typically, the α level is set at 0.05 and the β level is set at 0.10.
They can be set at any level depending on what you want to know.
The α risk is also called the p-value.
1 - α = confidence that an observed outcome in the sample is real.
We typically look for a p-value below 0.05 because it corresponds to at least 95% confidence that the observed outcome is real.
The Central Limit Theorem
Why are distributions normal?
When all factors are random.
Some measurements are actually averages over time of micro-measurements.
In other words, what we see as a measurement is actually an average.
The Central Limit Theorem explains why a distribution of averages tends to be normal.
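A quick numerical illustration of this idea: individual values drawn from a very non-normal (uniform) distribution still produce sample averages that cluster symmetrically around the population mean, with spread σ/√n. The sample counts and seed below are arbitrary.

```python
# Central Limit Theorem demo: averages of uniform samples behave normally.
import numpy as np

rng = np.random.default_rng(7)
population_mean = 0.5               # mean of Uniform(0, 1)
population_sigma = 1 / 12 ** 0.5    # std dev of Uniform(0, 1), about 0.289

# 5000 samples of size 30, each reduced to its average
averages = rng.uniform(0, 1, size=(5000, 30)).mean(axis=1)

print(f"Mean of the averages:    {averages.mean():.3f}")  # near 0.5
print(f"Std dev of the averages: {averages.std():.3f}")   # near sigma/sqrt(30)
```

The standard deviation of the averages comes out near 0.289/√30 ≈ 0.053, far tighter than the spread of the raw values, which is why control charts of averages look normal even when individual readings are not.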
Confidence
Sample statistics estimate the mean or standard deviation of a population.
The true population mean and standard deviation are unknown.
Confidence limits, levels, and intervals are used to estimate the population statistics.
For means: we use the t distribution to calculate limits, levels, and intervals.
For standard deviations: we use the χ² (chi-square) distribution to calculate limits, levels, and intervals.
Definition
Confidence Level:
The level of risk we are willing to take.
How sure we want to be that the population mean or standard deviation falls between the confidence limits.
A 95% confidence level is typical:
95% chance that the population mean or standard deviation falls between the limits.
5% chance (alpha risk) that the population mean or standard deviation isn't contained within the calculated limits.
Definition
Confidence Limit
Upper and lower limits that bracket the true mean or standard deviation of a population.
Calculated from the sample data and the appropriate test statistic.
The test statistic is dependent on the risk we accept that our results will be wrong.
Definition
Confidence Interval
The interval defined by the upper
and lower confidence limits.
A range of values based on
Sample mean or sample standard deviation
Sample size
Confidence level
Appropriate test statistic
Contains
Population mean or
Population standard deviation
Basic Concepts
Confidence Limits For Means
Confidence Limit Formulas - Means
Lower confidence limit: x̄ - t(α/2, n-1) · s/√n
Upper confidence limit: x̄ + t(α/2, n-1) · s/√n
Confidence Interval:
Lower Confidence Limit ≤ Mean ≤ Upper Confidence Limit
x̄ - t(α/2, n-1) · s/√n ≤ μ ≤ x̄ + t(α/2, n-1) · s/√n
Confidence Limit - Example
The tensioning device (rubber band) used on the Silobuster has come under scrutiny.
Two sets of tensioners are measured and descriptive statistics are run. What is the 95% confidence interval for the mean?
x̄ - t(α/2, n-1) · s/√n ≤ μ ≤ x̄ + t(α/2, n-1) · s/√n
Set 1: Mean 0.250, Standard Deviation 0.005, Sample Size 25.
Set 2: Mean 0.250, Standard Deviation 0.005, Sample Size 100.
Set 1: We are 95% confident that the interval 0.2479 to 0.2521 brackets the true process mean (0.0042 width).
Set 2: We are 95% confident that the interval 0.2490 to 0.2510 brackets the true process mean (0.0020 width).
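The Set 1 interval can be reproduced directly from the t-distribution formula x̄ ± t(α/2, n-1)·s/√n. This sketch uses scipy (the course uses Minitab), with the Set 1 statistics from the example.

```python
# Reproduce the Set 1 95% confidence interval for the mean.
import math
from scipy import stats

x_bar, s, n = 0.250, 0.005, 25
alpha = 0.05

t_crit = stats.t.ppf(1 - alpha / 2, df=n - 1)   # about 2.064 for 24 df
half_width = t_crit * s / math.sqrt(n)

lower, upper = x_bar - half_width, x_bar + half_width
print(f"95% CI for the mean: {lower:.4f} to {upper:.4f}")  # 0.2479 to 0.2521
```

Changing n to 100 (and the t critical value with it) narrows the interval to 0.2490-0.2510, matching Set 2: quadrupling the sample size roughly halves the width.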
Basic Concepts
Confidence Limits For Standard Deviation
Confidence Limit Formulas - Variation
Confidence Interval:
Lower Confidence Limit ≤ Standard Deviation ≤ Upper Confidence Limit
√[(n-1)s² / χ²(α/2)] ≤ σ ≤ √[(n-1)s² / χ²(1-α/2)]
where χ²(α/2) and χ²(1-α/2) are the upper-tail and lower-tail critical values of the chi-square distribution with n-1 degrees of freedom, and σ is the population standard deviation.
Confidence Limit Formulas - Variation: Example
The tensioning device (rubber band) used on the Silobuster has come under scrutiny.
Two sets of tensioners are measured and descriptive statistics are run. What is the 95% confidence interval for the variation?
Set 1: Mean 0.250, Standard Deviation 0.005, Sample Size 25. Limits: lower = 0.0039 and upper = 0.0070.
Set 2: Mean 0.250, Standard Deviation 0.005, Sample Size 100. Limits: lower = 0.0044 and upper = 0.0058.
Set 1: We are 95% confident that the interval 0.0039 to 0.0070 brackets the true process standard deviation (0.0031 width).
Set 2: We are 95% confident that the interval 0.0044 to 0.0058 brackets the true process standard deviation (0.0014 width).
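The Set 1 limits follow from the chi-square formula √[(n-1)s²/χ²]. This sketch uses scipy (the course uses Minitab), with the Set 1 statistics from the example.

```python
# Reproduce the Set 1 95% confidence interval for the standard deviation:
#   sqrt((n-1)s^2 / chi2_upper) <= sigma <= sqrt((n-1)s^2 / chi2_lower)
import math
from scipy import stats

s, n = 0.005, 25
alpha = 0.05

chi2_upper = stats.chi2.ppf(1 - alpha / 2, df=n - 1)  # about 39.36
chi2_lower = stats.chi2.ppf(alpha / 2, df=n - 1)      # about 12.40

lower = math.sqrt((n - 1) * s**2 / chi2_upper)
upper = math.sqrt((n - 1) * s**2 / chi2_lower)
print(f"95% CI for sigma: {lower:.4f} to {upper:.4f}")  # 0.0039 to 0.0070
```

Note the interval is not symmetric about s = 0.005, because the chi-square distribution itself is skewed; with n = 100 the same calculation gives the tighter 0.0044 to 0.0058 of Set 2.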
TEST OF HYPOTHESIS - Roadmap
You want to compare the averages/medians of samples of data to decide if they are statistically different.
Are the samples normally distributed? If no, transform the data, or compare median values instead of averages.
If yes, how many samples do you want to compare?
1 sample: 1-sample t-test - comparing the average of one sample against a target.
2 samples: 2-sample t-test - comparing the averages of two samples against each other; or paired t-test - comparing the averages of two samples that contain data linked in pairs.
3 or more samples: one-way ANOVA - comparing the averages of three or more samples against one another.
For medians: Kruskal-Wallis - for samples that do not have any outliers; Mood's median test - for samples that have some outliers.
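The roadmap above can be sketched as code: check normality first, then branch on the number of samples. This is an illustrative Python sketch with scipy (the course uses Minitab); scipy's Shapiro-Wilk test stands in for "are the samples normally distributed?", and the data is made up.

```python
# A rough, illustrative encoding of the test-selection roadmap.
import numpy as np
from scipy import stats

def pick_test(samples, alpha=0.05):
    """Suggest a comparison test following the roadmap."""
    normal = all(stats.shapiro(s).pvalue > alpha for s in samples)
    if not normal:
        # Transform the data, or compare medians instead of averages
        return "Kruskal-Wallis or Mood's median test (compare medians)"
    if len(samples) == 1:
        return "1-sample t-test (one average vs. a target)"
    if len(samples) == 2:
        return "2-sample t-test (or paired t-test for linked pairs)"
    return "One-way ANOVA (three or more averages)"

rng = np.random.default_rng(3)
a = rng.normal(10.0, 1.0, 30)
b = rng.normal(10.5, 1.0, 30)
print(pick_test([a, b]))
```

A real study would also weigh outliers (Kruskal-Wallis vs. Mood's median) and whether observations are paired, which this sketch only notes in its return strings.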
Design of
Experiments
EXERCISE
Represent the following data in graphical form:

Temperature  Pressure  Response
100          250       275
100          300       285
120          250       270
120          300       325
EXERCISE - continued
a) Determine what parameter settings yield the largest response.
b) Determine what parameter settings of pressure would be best if it were important to reduce the variability of the response that results from frequent temperature variations between the two extremes.
EXERCISE - continued
[Plot: Response (0 to 700) vs. Temperature (100, 120), with one line for Pressure = 250 and one for Pressure = 300]
Design Of Experiments
Design of Experiments (DOE) is a valuable tool to optimize product and process designs, to accelerate the development cycle, to reduce development costs, to improve the transition of products from research and development to manufacturing, and to effectively troubleshoot manufacturing problems. Today, Design of Experiments is viewed as a quality technology to achieve product excellence at the lowest possible overall cost.
Design of Experiments
General Comments
Keep your experiments simple.
Don't try to answer all the questions in one study.
Use 2-level designs to start.
Tie potential business results to the project.
The best time to design an experiment is after the previous one is finished.
Always verify results in a follow-up study (verification).
Be ready for changes.
A final report is a must to share the knowledge.
Avoid DOE infatuation: do your homework first!
Measure and Analyze to reduce the potential variables.
Use graphical analysis.
Use the basic tools of Operational Excellence.
Design of Experiments
Be Proactive
DOE is a proactive tool.
If the DOE output is inconclusive:
You may be working with the wrong variables.
Your measurement system may not be capable.
The range between the high and low levels may be insufficient.
There is no such thing as a failed experiment:
Something is always learned.
New data prompts us to ask new questions and generates follow-up studies.
Remember to keep an open mind:
Let the data/output guide your conclusions.
Debunk or validate tribal knowledge.
Don't let yourself be confused by the facts.
Design Of Experiments
Types of Experiments
Traditional (informal) approach:
Trial and error methods - introduce a change and see what happens.
Running special lots or batches - produced under controlled conditions.
Pilot runs - set up to produce a desired effect.
One-factor-at-a-time experiments - vary one factor and keep all other factors constant.
Six Sigma approach:
Planned comparisons of two to four factors - study separate effects and interactions.
Experiments with 5 to 20 factors - screening studies.
Comprehensive experimental plans with many phases - formal modeling, multiple factor levels.
Design Of Experiments
Barriers to Successful DOEs
Problem or objective unclear.
Results of the experiments unclear:
Be present during the DOE.
Identify and record unexpected noise or other variables.
Measurement error.
Lack of management support.
Lack of experimental discipline:
Don't use a DOE as the first pass to identify key Xs.
Manage the constants and the noise (process map, C&E, Constant or Noise or Experimental).
Unstable process prior to running the DOE:
Process map, C&E, Constant or Noise or Experimental.
Manage the Cs and Ns to reduce extraneous variation.
Design Of Experiments
Objective
Establish the objective for the experiment.
It should be stated in such a way as to provide guidance to those involved in designing the experiment.
What is the purpose of the experiment?
Design Of Experiments
Planning the Experiment
Team involvement.
Maximize prior knowledge.
Pursue measurable objectives.
Plan the execution of all phases.
Rigorous sample size determination.
Allocate sufficient resources for data collection and analysis.
Design Of Experiments
The following are some of the objectives of experimentation in an industry:
Improving efficiency or yield.
Finding optimum process settings.
Locating sources of variation.
Correlating process variables with product characteristics.
Comparing different processes, machines, materials, etc.
Designing new processes and products.
Various Terms Used In Experimentation
Factor:
One of the controlled or uncontrolled variables whose influence on the response is being studied. May be variable or classification data.
Level:
The values of the factor being studied, usually high (+) and low (-).
Treatment Combination:
An experimental run using a set of specific levels of each input variable.
Response Variable:
The variable that is being studied; the Y factor in the study. The measured output variable.
Interaction:
The combined effect of two or more factors that is observed in addition to the main effect of each factor individually.
Various Terms Used In Experimentation
Confounding:
One or more effects that cannot unambiguously be attributed to a single factor or interaction.
Main effect:
The change in the average response observed during a change from one level to another for a single factor.
Replication:
Repetition of the entire experiment. Treatment combinations are not repeated consecutively.
Test run:
A single combination of factors that yields one or more observations of the response.
Treatment:
A single level assigned to a single factor during an experiment.
Trial And Error
Perhaps the most well-known and widely used methodology.
The objective is to provide a quick fix to a specific problem.
The quick fix occurs by randomly and non-randomly making changes to process parameters.
Often changing two or more parameters at the same time.
The result is often a Band-Aid fix, as the symptoms of the problem are removed but the cause of the problem goes undetected.
In trial and error experimentation, knowledge is not expanded but hindered.
Multiple expensive fixes may be implemented that are not necessary.
One-Factor-At-A-Time (OFAT)
The old dogma in experimentation is to hold everything constant and vary only one factor at a time.
This assumes any changes in the response would be due only to the manipulated factor.
But are they?
Is it reasonable to assume that one can hold all variables constant while manipulating one?
Experience tells us this is virtually impossible.
Imagine there are a large number of possible factors affecting the response variable:
How long would OFAT take to identify the critical factors and where they should be run for best results?
How much confidence would you have that the knowledge gained would apply in the real world?
OFAT
Although OFAT may simplify the analysis of results, the experimental efficiency given up is significant:
We don't know the effects of changing one factor while other factors are changing (a reality).
Unnecessary experiments may be run.
The time to find causal factors (factors that affect the response) is significant.
Classification Of Factors
1. Experimental factors are those which we actually experiment with by varying them at various levels.
2. Control factors are those which are kept at a constant (controlled) level throughout experimentation.
3. Error or noise factors are those which can neither be changed at will nor fixed at one particular level. The effect of these factors causes the error component in the experiment, and as such these factors are termed error or noise factors.
Experimental Design
Visualization of the Design (2 Levels, 1 Factor)
This is often the method used today for process optimization. It is the one-factor-at-a-time concept.
[Diagram: a single axis for Factor 1, from Low to High]
Experimental Design
Visualization of the Design (2 Levels, 2 Factors)
The most basic of true designs. There are 4 runs.
[Diagram: a square with Factor 1 (Low to High) on one axis and Factor 2 (Low to High) on the other; the four corners are the runs]
Experimental Design
Visualization of the Design (2 Levels, 3 Factors)
A little more complicated design, but still very practical. There are only 8 runs.
[Diagram: a cube with Factors 1, 2, and 3 each running from Low to High; the eight corners are the runs]
Experimental Design
Visualization of the Design (2 Levels, 4 Factors)
[Diagram: two cubes, one at each level (Low, High) of Factor 4]
Experimental Design
Visualization of the Design (2 Levels, 5 Factors)
Here is where it's time to stop drawing, but it represents the complexity associated with a 5-factor design.
[Diagram: four cubes arranged by the Low/High levels of Factors 4 and 5]
Experimental Design
Three-Factor Factorial Design, without interaction columns:

A  B  C  Y
-  -  -
-  -  +
-  +  -
-  +  +
+  -  -
+  -  +
+  +  -
+  +  +
Experimental Design
Three-Factor Factorial Design, with interaction columns:

A  B  C  AB  AC  BC  ABC  Y
-  -  -  +   +   +   -
-  -  +  +   -   -   +
-  +  -  -   +   -   +
-  +  +  -   -   +   -
+  -  -  -   -   +   +
+  -  +  -   +   -   -
+  +  -  +   -   -   -
+  +  +  +   +   +   +
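A 2-level design matrix of this kind, interaction columns included, can be generated mechanically: each interaction sign is just the product of its parent columns. A short Python sketch (names are illustrative):

```python
# Build the 2^3 full factorial design matrix with interaction columns.
from itertools import product

runs = list(product([-1, 1], repeat=3))  # 8 treatment combinations, C fastest

header = ["A", "B", "C", "AB", "AC", "BC", "ABC"]
print(" ".join(f"{h:>3}" for h in header))
for a, b, c in runs:
    # Interaction signs are products of the main-effect signs
    row = [a, b, c, a * b, a * c, b * c, a * b * c]
    print(" ".join(f"{'+' if v > 0 else '-':>3}" for v in row))
```

The same `product([-1, 1], repeat=k)` call scales to any number of factors, which makes the exponential growth in runs (2^k rows) easy to see.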
Design Of Experiments (DOE)
[Plot (a): Only interaction, no main effect. Cancel amount (0 to 70) vs. change of address for CSR A and CSR B; the lines for Call Duration 1 and Call Duration 2 cross.]
DOE: Continued
[Plot (b): Only main effect, no interaction. The lines for Call Duration 1 and Call Duration 2 are parallel.]
[Plot (c): Main effect with slight interaction. The lines for Call Duration 1 and Call Duration 2 converge but do not cross sharply.]
Example: Full Factorial Cookie DOE
Company HillsBerry produces premium cookies. The company needs to increase throughput in the bake process by 20% in order to keep up with demand.
Market research indicates customers require the cookies to have a taste index rating greater than 45.
They are interested in understanding the effects of two factors, bake time and bake temperature:
1) Bake Time (A)
2) Bake Temperature (B)
The current bake process is 10 minutes at 375 F per batch.
Full Factorial Cookie DoE
Objective:
Reduce cook time by 20% while maintaining customer satisfaction for taste.
Response Variable:
Taste score (rated by multiple tasters).
Historical data indicates a score above 45 will delight the customer.
Input Variables:
Bake Time (A).
Bake Temperature (B).
What questions should be asked?
Construct a factorial design.
How should the experiment be run?
Full Factorial Cookie DoE
DOE Design and Results

Bake Time (min)  Bake Temp (F)  Taste
A                B
6                375            41
10               375            50
6                450            47
10               450            35

Questions:
Which factors are important?
How should the important factors be set?
Full Factorial
DOE Design and Results (sorted by taste)

Bake Time (min)  Bake Temp (F)  Taste
A                B
10 (+)           375 (-)        50
6 (-)            450 (+)        47
6 (-)            375 (-)        41
10 (+)           450 (+)        35

Sort by taste and look for trends in the factors.
Practically speaking, what observations can be made about the data if:
A change of 10 is required?
A change of 50 is required?
What preliminary actions would you recommend right now?
Full Factorial
DOE Design and Results
To begin analyzing the importance of each factor, code the levels (+) and (-).

Bake Time (min)  Bake Temp (F)  Taste
A                B
10 (+)           375 (-)        50
6 (-)            450 (+)        47
6 (-)            375 (-)        41
10 (+)           450 (+)        35

Using the (+) and (-) designation will be helpful in understanding how designs are generated, and allows simple analysis of the DOE.
Analyzing the Full Factorial
To calculate the importance of a factor, called the effect, calculate the difference between the average results at each level of the variable:
Effect = (Average at + level) - (Average at - level), for each factor.

A (time)  B (temp)  AB  Taste
- (6)     - (375)   +   41
+ (10)    - (375)   -   50
- (6)     + (450)   -   47
+ (10)    + (450)   +   35

For factor A, (+) - (-):
(+) = (50 + 35)/2 = 42.5
(-) = (41 + 47)/2 = 44.0
Therefore the main effect of A = 42.5 - 44.0 = -1.5.
The main effect of B (temp), (+) - (-):
(+) = (47 + 35)/2 = 41.0
(-) = (41 + 50)/2 = 45.5
The main effect of B = 41.0 - 45.5 = -4.5.
Notice how the effects compare to the graphs.
Interactions
Interaction: two factors (input variables) interact if one factor's effect on the response is dependent upon the level of the other factor.
For example: time and temperature in baking cookies.

Data set:
A (time)  B (temp)  AB  Taste
- (6)     - (375)   +   41
+ (10)    - (375)   -   50
- (6)     + (450)   -   47
+ (10)    + (450)   +   35

[Interaction plot: taste (30 to 55) vs. time (6, 10) with one line for B = 375 and one for B = 450; the B = 375 line rises from 41 to 50, the B = 450 line falls from 47 to 35, and the lines cross. Regions are labeled "too raw", "good", "very good", and "too done".]

Strong interactions are indicated by lines crossing at nearly 90 degrees (perpendicular). As the lines become more parallel, the interactions are weaker.
Interactions
The (+) and (-) coding for the interaction can be created by multiplying columns A and B:

Run  A  x  B  =  AB
1    -  x  -  =  +
2    +  x  -  =  -
3    -  x  +  =  -
4    +  x  +  =  +

A (time)  B (temp)  AB  Taste
- (6)     - (375)   +   41
+ (10)    - (375)   -   50
- (6)     + (450)   -   47
+ (10)    + (450)   +   35

Now we can calculate the importance (effect) of the interaction of time and temperature on taste for baking cookies.
To determine the interaction effect, use the +/- column to group the data just as before.
In this case: AB = (41 + 35)/2 - (50 + 47)/2 = -10.5.
Notice how the effects compare to the graphs.
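The effect calculations for the cookie data can be checked in a few lines of Python (the run list and variable names are just an illustrative encoding of the table above):

```python
# Compute main effects and the interaction effect for the cookie DOE:
#   effect = (average at + level) - (average at - level)
runs = [
    # (A: time code, B: temp code, taste)
    (-1, -1, 41),   #  6 min, 375 F
    (+1, -1, 50),   # 10 min, 375 F
    (-1, +1, 47),   #  6 min, 450 F
    (+1, +1, 35),   # 10 min, 450 F
]

def effect(signs):
    tastes = [r[2] for r in runs]
    plus = [y for s, y in zip(signs, tastes) if s > 0]
    minus = [y for s, y in zip(signs, tastes) if s < 0]
    return sum(plus) / len(plus) - sum(minus) / len(minus)

A = [r[0] for r in runs]
B = [r[1] for r in runs]
AB = [a * b for a, b in zip(A, B)]   # interaction column = A x B

print("Effect of A (time):       ", effect(A))    # -1.5
print("Effect of B (temperature):", effect(B))    # -4.5
print("Effect of AB interaction: ", effect(AB))   # -10.5
```

The same `effect` function works for any coded column, which is why the +/- coding is so convenient: interactions are analyzed exactly like main effects.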
Analysis Summary
Now that we have calculated the effect for time, temperature, and the time/temperature interaction, we can compare the relative importance of each:

Factor              Effect
Time                -1.5
Temperature         -4.5
Time x Temperature  -10.5

In this experiment, the Time x Temperature interaction has the largest effect and is therefore the most important.
Practical significance:
Controlling time or temperature alone will not ensure good-tasting cookies. Time and temperature should be analyzed and set together to ensure the best-tasting cookies.
Analysis Summary

Factor              Effect
Time (A)            -1.5
Temperature (B)     -4.5
Time x Temperature  -10.5

Y = Grand average + ((effect A)/2)*A + ((effect B)/2)*B + ((effect AB)/2)*AB
Taste = 43.25 - ((1.5)/2)*A - ((4.5)/2)*B - ((10.5)/2)*AB

What is the optimum setting of time and temperature that will yield a 20% improvement in throughput and maintain the taste index above 45?
Time (factor A) needs to be 8 minutes, and Y (taste) needs to be > 45.
Plug into the predictive equation and solve for temperature.
Remember the equation is based on -1 to +1 coded data.
Creating The Predictive Equation
Solve for the time reduction at 80% of 10 minutes = 8 minutes.
Actual time = (hi + lo)/2 + ((hi - lo)/2) * coded value(A)
8 = (10 + 6)/2 + ((10 - 6)/2) * coded value(A) = 8 + 2*(0)
The coded value for 8 minutes is 0.
Determine the optimal settings for the bake process:
Y = 45, A = 0.
Use the predictive equation and solve for B (temperature):
45 = 43.25 - ((1.5)/2)*(0) - ((4.5)/2)*B - ((10.5)/2)*(0)*B
Solve for B: the coded value for B = -0.78.
Converting back to actual temperature:
Actual temp = (hi + lo)/2 + ((hi - lo)/2) * coded value(B)
Actual temp = (450 + 375)/2 + ((450 - 375)/2)*(-0.78)
Actual temp = 383 F
8 minutes at 383 F will achieve the desired throughput and delight the customers.
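The whole coded-to-actual calculation can be checked numerically. This sketch uses the grand average (41 + 50 + 47 + 35)/4 = 43.25 and the effects from the analysis; variable names are illustrative.

```python
# Solve the cookie predictive equation for the bake temperature.
grand_avg = 43.25                      # (41 + 50 + 47 + 35) / 4
eff_A, eff_B, eff_AB = -1.5, -4.5, -10.5

# Coded time for 8 minutes: actual = (hi+lo)/2 + ((hi-lo)/2)*coded
coded_A = (8 - (10 + 6) / 2) / ((10 - 6) / 2)          # = 0.0

# Taste = grand_avg + (eff_A/2)*A + (eff_B/2)*B + (eff_AB/2)*A*B
# With Taste = 45 and A known, solve for coded B:
coded_B = (45 - grand_avg - (eff_A / 2) * coded_A) / (
    eff_B / 2 + (eff_AB / 2) * coded_A)

actual_temp = (450 + 375) / 2 + ((450 - 375) / 2) * coded_B
print(f"Coded B: {coded_B:.2f}")                  # about -0.78
print(f"Bake temperature: {actual_temp:.0f} F")   # about 383 F
```

Because coded A is exactly 0 at 8 minutes, the interaction term drops out and the equation reduces to a single linear solve for B.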
Example 2:
3 factors at 2 levels (Time-A, Temperature-B, and Pressure-C).
8 runs minimum (2 x 2 x 2).
Complete the 3 factor-interaction columns:

#  A  B  C  AB  AC  BC  ABC
1  -  -  -
2  +  -  -
3  -  +  -
4  +  +  -
5  -  -  +
6  +  -  +
7  -  +  +
8  +  +  +

How would you calculate the importance (effect) of each factor?
Full Factorial
The limitations of a full factorial experiment are not in theory but in practice. The resources (i.e., time, cost) necessary to run full factorial designs can be significant.

Factors  Levels  Symbol  Experiments
2        2       2^2     4
3        2       2^3     8
7        2       2^7     128
15       2       2^15    32,768
2        3       3^2     9
3        3       3^3     27
7        3       3^7     2,187

Full factorials can be used when investigating a small number of variables (2-4), but are not recommended when investigating a larger number of factors (5 or more).
The number of runs needed increases exponentially with the number of factors. Therefore, the time and costs involved in running the experiment become prohibitive.
This does not mean that full factorials are not useful, but that they should be used at the proper time.
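The "Experiments" column in the table is simply levels raised to the number of factors, which makes the exponential growth easy to verify:

```python
# Full factorial run counts: experiments = levels ** factors.
counts = {(factors, levels): levels ** factors
          for factors, levels in [(2, 2), (3, 2), (7, 2), (15, 2),
                                  (2, 3), (3, 3), (7, 3)]}

for (factors, levels), runs in counts.items():
    print(f"{factors:>2} factors at {levels} levels -> {runs:,} runs")
```

Going from 7 to 15 two-level factors multiplies the run count by 256, which is exactly why screening designs (fractional factorials) exist for larger studies.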
DOE
1) Define goals/objectives:
   Specific, measurable.
2) Determine response variable(s):
   Investigate the measurement system.
   Consider multiple responses.
3) Select factors:
   Basic logical problem-solving techniques (process map, FMEA, C/E, ranking, etc.).
   Consider the strategy.
4) Identify factor levels:
   Realistic (would you want to run at the selected levels?).
   Move the response variable.
5) Choose the appropriate design and assign factors:
   Increase resolution if possible by properly assigning factors.
   Consider where interactive effects will show up.
6) Plan the experiment:
   Sampling, replication, repetition.
   Data collection methods.
7) Perform the experiment:
   Note observations during the experiment.
8) Analyze results:
   Six Sigma rules of analysis: practical, graphical, analytical.
9) Make decisions:
   Goals met?
   Plan the next experiment?
   Management presentation?
Summary
Learning requires two necessary elements: a significant event and a perceptive observer.
Learning is accelerated by creating significant events and observing them. Designed experiments are most effective.
Learning is an iterative process:
We make hypotheses based on our current knowledge.
We test these hypotheses by collecting data and information.
We make deductions based on the information collected, and form new hypotheses.
Repeat collecting data until we are confident we have learned what we need to.
Factorial designs help:
They are efficient tests.
Information from factorials may be combined to get better results faster.
Benchmarking
Benchmarking
Benchmarking is "quality by comparison" for achieving better standards. In the global movement today, the competition is improving at a faster rate, and the only way to improve your relative quality and move upwards is to find out and implement the best industry practices.
Benchmarking
Stages of Improvement:
World Class - recognized as the best; benchmarked by others.
Best in Class - exceeds customer expectations, outperforms all competitors, and has a clear competitive edge.
Efficient - meets all internal requirements for cost margins, asset utilization, cycle time, and measures of excellence.
Effective - satisfies all customer requirements.
Incapable - is inefficient, ineffective, and at risk of failing; needs major redesign.
Benchmarking
Methodology
Benchmarking can be of various types:
1. Competitive benchmarking
2. Product benchmarking
3. Process benchmarking
4. Best practices benchmarking
Whatever the category chosen by the organization, the benchmarking methodology remains the same in each case.
A. Identify Processes To
Benchmark
Select Processes to Benchmark
Measure current process
capability and define goal.
Understand detailed process
which needs improvement.
B. Select Organization To
Benchmark
Select organizations which
perform your process.
Compile a list of world class
process parameters.
C. Compile the Required Information
Develop a detailed questionnaire to obtain detailed information.
Obtain the desired information. The information can be obtained from various sources, viz. the internet, trade journals, professional associations, industry publications, industry experts, libraries, etc.
D. Analyze Gaps
Too much information is as bad as too little.
Gather and analyze only the information you need to make a direct comparison of performance.
Compare like with like.
Identify the performance gaps and develop an action plan to close the gaps.
Also highlight and quantify the consequences of not closing the gap.
E. Develop An Action Plan
Review observations of the gap
analysis.
Set new performance standards.
Develop an action plan for
meeting the new performance
standards, identify process
owners with their responsibilities
and move to improvement phase.
Mistake Proofing
Mistake Proofing
Mistake proofing is a scientific technique for the improvement of operating systems, including materials, machines, and methods, with the aim of preventing problems due to human error. The term "error" means a sporadic deviation from standard procedures resulting from a lapse of memory, perception, or motion.
Defects Vs Errors
It is important to understand that defects and errors are not the same thing. A defect is the result of an error; an error is the cause of a defect, as explained below:
Cause (ERROR) -> Result (DEFECT)
Prevention of Defects
Cause: machine or human error.
Intermediate result: detect the error, take action, feed back.
End result: zero defects.
Stop errors from turning into defects.
Types of Errors
Error in memory of PLAN: errors of forgetting the sequence/contents of operations required or restricted in standard procedures.
Error in memory of EXECUTION: errors of forgetting the sequence/contents of operations having been finished.
Error in PERCEPTION: errors of selecting the wrong object in type or quantity.
Types of Errors
Error in MOVEMENT: errors of misunderstanding/misjudging the shape, position, direction, or other characteristics of the objects.
Error in motion of HOLDING: errors of failing to hold objects.
Error in motion of CHANGING: errors of failing to change the shape, position, direction, or other characteristics of an object according to the standard.
Are Errors Unavoidable?
Traditional view: errors are inevitable.
People are only human.
There is variation in everything.
Lack of standard operating procedures results in each person having their own way of doing things.
Inspection is necessary.
Six Sigma view: errors can be eliminated.
Not all errors can be eliminated, but many can, and others can be reduced.
The more errors we can eliminate, the better our quality.
The need for inspection can be reduced or eliminated.
Basic Functions of Poka Yoke
State: defect is ABOUT TO OCCUR (prediction)
  SHUTDOWN: normal functions are stopped when a defect is predicted.
  CONTROL: even intentional errors are impossible.
  WARNING: signals that an abnormality or error is about to occur.
State: defect has OCCURRED (detection)
  SHUTDOWN: normal functions are stopped when a defect is detected.
  CONTROL: defective items cannot pass on to the next stage.
  WARNING: signals that defects have occurred.
Principles of Poka Yoke
Respect the intelligence of workers.
Take over repetitive tasks or actions that depend on constantly being alert (vigilance) or on memory.
Free a worker's time and mind to pursue more creative and value-adding activities.
It is not acceptable to produce even a small number of defects or defective products.
The objective is zero defects.
Ten Types of Human Errors
1. Forgetfulness (not concentrating)
2. Errors due to misunderstanding (jumping to conclusions)
3. Errors in identification (viewed incorrectly, too far away)
4. Errors made by untrained workers
5. Willful errors (ignoring rules)
6. Inadvertent errors (distraction, fatigue)
7. Errors due to slowness (delay in judgment)
8. Errors due to lack of standards (written and visual)
9. Surprise errors (machine not capable, malfunctions)
10. Intentional errors (sabotage; least common)
Use mistake proofing to eliminate these human errors.
Human Error Provoking
Conditions
1. Adjustments
2. Tooling/ tooling change
3. Dimensionality/ specification/ critical condition
4. Many parts/ mixed parts
5. Multiple Steps
6. Infrequent Production
7. Lack of, or ineffective standards
8. Symmetry
9. Asymmetry
10.Rapid Repetition
11.High volume/ extremely high volume
12.Environment conditions
a. Material/process handling
b. Housekeeping
c. Foreign matter
d. Poor lighting
Three Levels of Poka Yoke
1. Catching errors before they create defects.
2. Catching errors during the process of creating defects.
3. Catching errors that have created defects, and keeping the defects from moving further along in the process.
Poka Yoke: How to Use It?
Step-by-step process for applying poka-yoke:
Identify the operation or process based on a Pareto.
Analyze the 5 whys and understand the ways a process can fail.
Decide the right poka-yoke approach, such as using a shut-out type (preventing an error being made) or an attention type (highlighting that an error has been made) poka-yoke.
Take a more comprehensive approach instead of merely thinking of poka-yokes as limit switches or automatic shutoffs. A poka-yoke can be electrical, mechanical, procedural, visual, human, or any other form that prevents incorrect execution of a process step.
Where Poka Yoke Works Well
Where manual operations, and thereby worker vigilance, are needed.
Where mis-positioning can occur.
Where adjustment is required.
Where teams need common-sense tools and not another buzzword.
Where SPC is difficult to apply or apparently ineffective.
Where training costs and employee turnover are high.
Where customers make mistakes and blame the service provider.
Where special causes can reoccur.
Where external failure costs dramatically exceed internal failure costs.
Poka Yoke Advantages
No formal training programs required.
Eliminates many inspection operations.
Relieves operators from repetitive tasks.
Promotes creativity and value-adding activities.
Results in defect-free work.
Provides immediate action when problems arise.
Human Error Provoking
Situations
Inadequate written standards
Too many parts
Mix up
Too many steps
Specifications or critical conditions
Too many adjustments
Tooling change
Frequent change
Environment factors
Principles Of Mistake Proofing
The principles of mistake proofing can be categorized into two groups:
A. Prevention of occurrence
B. Minimization of effects
A. Prevention Of Occurrence
Methods under this principle aim to prevent the occurrence of human errors at all stages of operations and make corrections unnecessary. This can be done through the following three methods:
Elimination
Replacement
Facilitation
Elimination
The elimination method aims to remove the system properties which generate operations/restrictions susceptible to human errors, so as to make them unnecessary. Consider the error of an operator touching a high-temperature pipe and getting burnt. One method of preventing this error is to make the pipe safe by covering it with insulating material. This improvement removes the property of the pipe which generates the restriction.
Replacement
The replacement method aims to use means more reliable than operators, and make it unnecessary for operators to perform such functions as memory, perception, or motion. The use of specially partitioned parts boxes holding a specified number of parts, to prevent the error of supplying an incorrect number of parts, is an example of a foolproofing method.
Facilitation
The principles of elimination and replacement described above make it unnecessary for operators to perform operations or observe restrictions. In contrast to these, the purpose of facilitation is to make such functions as memory, perception, and motion required in operations easy for the operators, and thus reduce the occurrence rate of human errors.
B. Minimization Of Effects*
Methods under this principle aim to minimize the effects of human error, and focus on processes where errors develop into serious problems of quality, safety, or efficiency. This can be divided into two categories:
1. Detection
2. Mitigation
*This principle is also known as "stop-in-time".
Detection
The concept of detection is that even if a human error occurs, the deviations from standard states caused by it can be detected and corrected in the succeeding operation steps. The methods of correcting the detected deviations are classified into two types:
1. Operators are informed of, or find by themselves, the deviations and take the necessary corrective actions.
2. The deviations are automatically corrected without operators.
There are many methods of informing the operator. The first is to bring the deviation to the operator's notice by alarms or by the remaining parts; the second is to make the successive operations impossible.
Mitigation
The mitigation method aims to make operations redundant, or to incorporate shock-absorbing/protecting materials, so as to mitigate the effects of human errors in their development process.
Example: consider the error of burning out the motor of a production machine by forgetting to switch off the power supply. The mitigation countermeasure is to install a fuse which cuts off the power supply when the temperature reaches a certain point.
THANK YOU