
• One-way ANOVA (Analysis of Variance) is the procedure used when only one factor is being considered.
• A factor in ANOVA describes the cause of the variation in the data.
• A level in ANOVA describes the number of categories within the factor of interest.
• The simplest type of ANOVA is the completely randomized one-way ANOVA, which involves an independent random selection of observations for each level of one factor.
The following conditions must be met to use one-way ANOVA:
1. The populations of interest must be normally distributed.
2. The samples must be independent of each other.
3. Each population must have the same variance.
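As a quick, practical check of these conditions, the sketch below runs a Shapiro-Wilk test per sample and Levene's test across samples with SciPy; the raw observations are hypothetical placeholders of my own, not the data behind the worked example that follows.

import scipy.stats as stats

# Hypothetical raw observations for three levels of one factor (placeholder data).
group1 = [8.1, 9.4, 9.0, 10.2, 8.8, 9.2]
group2 = [10.5, 11.3, 10.9, 12.0, 10.1, 10.7]
group3 = [9.0, 9.6, 9.3, 10.1, 9.2, 9.7]

# Condition 1: each population approximately normal (Shapiro-Wilk test per sample).
for i, g in enumerate([group1, group2, group3], start=1):
    stat, p = stats.shapiro(g)
    print(f"sample {i}: Shapiro-Wilk p = {p:.3f}")

# Condition 3: equal variances across populations (Levene's test).
stat, p = stats.levene(group1, group2, group3)
print(f"Levene's test p = {p:.3f}")
# Condition 2 (independence) follows from the study design, not from a test.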
The hypotheses would look like the following:

H0 : µ1 = µ2 = µ3 (all the population means are equal)
H1 : not all µ's are equal

If the null hypothesis is rejected, it means that a difference does exist. Analysis of variance cannot compare population means to one another to determine which is greater; that task requires further analysis (the Scheffé test, shown at the end of this section).
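For orientation, the whole hypothesis test can be run in one call on raw data with SciPy's f_oneway; this is a minimal sketch using the same hypothetical placeholder samples as above, while the worked example below instead builds the F-statistic step by step from summary statistics.

import scipy.stats as stats

# Hypothetical placeholder samples, one list per level of the factor.
group1 = [8.1, 9.4, 9.0, 10.2, 8.8, 9.2]
group2 = [10.5, 11.3, 10.9, 12.0, 10.1, 10.7]
group3 = [9.0, 9.6, 9.3, 10.1, 9.2, 9.7]

# One-way ANOVA: returns the calculated F-statistic and its p-value.
f_stat, p_value = stats.f_oneway(group1, group2, group3)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
# Reject H0 at the 5% significance level when p < 0.05.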
SSW, the sum of squares within (also called SSE, the error sum of squares), is found by:

SSW = \sum_{i=1}^{k} (n_i - 1) s_i^2

where N is the total number of observations from all samples and k is the number of samples.

For example:

s_1^2 = 1.01   s_2^2 = 1.70   s_3^2 = 0.96
n_1 = 6        n_2 = 6        n_3 = 6
\bar{x}_1 = 9.12   \bar{x}_2 = 10.92   \bar{x}_3 = 9.48

SSW = (6 - 1)(1.01) + (6 - 1)(1.70) + (6 - 1)(0.96) = 18.35

Next, SSB (the sum of squares between) can be found by:

SSB = \sum_{i=1}^{k} n_i (\bar{x}_i - \bar{\bar{x}})^2

where the grand mean is \bar{\bar{x}} = \frac{\sum x}{N}; here \bar{\bar{x}} = 9.83.

SSB = 6(9.12 - 9.83)^2 + 6(10.92 - 9.83)^2 + 6(9.48 - 9.83)^2 = 10.86
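The same arithmetic in Python, working only from the summary statistics quoted above (all variable names are mine):

# Summary statistics quoted in the worked example.
variances = [1.01, 1.70, 0.96]   # sample variances s_i^2
means     = [9.12, 10.92, 9.48]  # sample means
sizes     = [6, 6, 6]            # sample sizes n_i

N = sum(sizes)   # total number of observations (18)
k = len(sizes)   # number of levels / samples (3)

# Grand mean as the size-weighted average of the sample means.
grand_mean = sum(n * m for n, m in zip(sizes, means)) / N   # about 9.84 (the slides round to 9.83)

# SSW = sum of (n_i - 1) * s_i^2
ssw = sum((n - 1) * s2 for n, s2 in zip(sizes, variances))  # 18.35

# SSB = sum of n_i * (mean_i - grand_mean)^2
ssb = sum(n * (m - grand_mean) ** 2 for n, m in zip(sizes, means))
print(ssw, ssb)   # ssb comes out near 10.89; the slides report 10.86 after rounding the grand mean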


Finally, the total variation of all the observations is known as the total sum of squares (SST) and can be found by:

SST = \sum (x - \bar{\bar{x}})^2 = SSW + SSB

SST = 18.35 + 10.86 = 29.21

Note that the variance of the original observations, s^2, can be determined by:

s^2 = \frac{SST}{N - 1} = \frac{29.21}{18 - 1} = 1.72
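A minimal continuation of the earlier snippet for these two quantities, reusing ssw, ssb and N:

# Total sum of squares and the variance of all N observations pooled together.
sst = ssw + ssb            # about 29.2 (29.21 in the slides, with their rounding)
s_squared = sst / (N - 1)  # about 1.72
print(sst, s_squared)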
• The mean square between (MSB) is a measure of variation between
the sample means.
• The mean square within (MSW) is a measure of variation within each
sample.
A large MSB variation, relative to the MSW variation, indicates that the sample means are not very close to one another. This condition will result in a large value of F, the calculated F-statistic. The larger the value of F, the more likely it is to exceed the critical F-statistic, leading us to conclude there is a difference between the population means.
To test the hypothesis for ANOVA, compare the calculated test statistic to a critical test statistic using the F-distribution.

The calculated F-statistic can be found using the equation:

F = \frac{MSB}{MSW}

where MSB is the mean square between, found by:

MSB = \frac{SSB}{k - 1}

and MSW is the mean square within, found by:

MSW = \frac{SSW}{N - k}

For example:

MSB = \frac{10.86}{3 - 1} = 5.43

MSW = \frac{18.35}{18 - 3} = 1.22

F = \frac{MSB}{MSW} = \frac{5.43}{1.22} = 4.45
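The same step as a short Python continuation, again reusing ssw, ssb, N and k from the earlier snippets:

# Mean squares and the calculated F-statistic.
msb = ssb / (k - 1)   # SSB / (k - 1): 10.86 / 2 = 5.43 in the slides
msw = ssw / (N - k)   # SSW / (N - k): 18.35 / 15, about 1.22
f_calc = msb / msw    # about 4.45
print(msb, msw, f_calc)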
• We use the F-distribution to determine the critical F-statistic, which is compared to the calculated F-statistic for the ANOVA hypothesis test.
• The critical F-statistic, Fα,k-1,N-k, depends on two different degrees of freedom, which are determined by v1 = k − 1 and v2 = N − k.

For example:
v1 = 3 − 1 = 2 and v2 = 18 − 3 = 15
The critical F-statistic is read from the F-distribution table. For v1 = 2 and v2 = 15, the critical F-statistic is F0.05,2,15 = 3.682.

Because the calculated F (4.45) > the critical F (3.682), H0 is rejected and H1 is accepted: not all the population means are equal.
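The table lookup can also be reproduced with SciPy's F-distribution; a minimal sketch, reusing k, N and f_calc from the earlier snippets and assuming a 5% significance level:

import scipy.stats as stats

alpha = 0.05
v1, v2 = k - 1, N - k   # degrees of freedom: 2 and 15

# Critical F-statistic: the value cutting off the upper alpha tail of F(v1, v2).
f_crit = stats.f.ppf(1 - alpha, v1, v2)   # about 3.682

# Decision rule: reject H0 when the calculated F exceeds the critical F.
print("reject H0" if f_calc > f_crit else "fail to reject H0")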
The Scheffé test statistic, Fs, for comparing two sample means i and j is as follows:

Fs = \frac{(\bar{x}_i - \bar{x}_j)^2}{MSW\left(\frac{1}{n_i} + \frac{1}{n_j}\right)}

The critical value for the Scheffé test, Fsc, is determined by multiplying the critical F-statistic from the ANOVA test by (k − 1), as follows:

Fsc = (k − 1) Fα,k-1,N-k

For example:
F0.05,2,15 = 3.682
Fsc = (3 − 1)(3.682) = 7.364

If Fs ≤ Fsc, we conclude there is no difference between the pair of sample means; otherwise, there is a difference.
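A sketch of this pairwise follow-up in Python, reusing means, sizes, msw and f_crit from the earlier snippets; note that the individual pairwise verdicts printed here are derived from the example's numbers rather than stated in the slides.

from itertools import combinations

fsc = (k - 1) * f_crit   # critical value for the Scheffé test, 2 * 3.682, about 7.364

# Compare every pair of sample means with the Scheffé statistic.
for (i, mi), (j, mj) in combinations(enumerate(means, start=1), 2):
    fs = (mi - mj) ** 2 / (msw * (1 / sizes[i - 1] + 1 / sizes[j - 1]))
    verdict = "difference" if fs > fsc else "no difference"
    print(f"samples {i} and {j}: Fs = {fs:.2f} -> {verdict}")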
THANKS!
