Overview
An introduction to a procedure for testing the hypothesis that three or more population means are equal. For example:
H0: µ1 = µ2 = µ3 = . . . = µk
H1: At least one mean is different
Definition: Analysis of Variance (ANOVA) is a method of testing the equality of three or more population means by analyzing sample variances.
ANOVA methods require the F-distribution:
1. The F-distribution is not symmetric; it is skewed to the right.
2. The values of F can be 0 or positive; they cannot be negative.
3. There is a different F-distribution for each pair of degrees of freedom for the numerator and denominator.
Critical values of F are given in Table D.
[Figure: the F-distribution curve is not symmetric (skewed to the right), takes nonnegative values only, and the right-tail area beyond the critical value is α.]
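As a quick illustration of these properties, here is a minimal sketch assuming SciPy is available; the degrees-of-freedom pairs are made up for illustration. It checks that the F density is zero for negative values, that the distribution is skewed to the right (mean greater than median), and that different pairs of degrees of freedom give different distributions:

```python
# Minimal sketch of F-distribution properties (illustrative df values only).
from scipy import stats

for df_num, df_den in [(3, 16), (5, 30)]:   # different (numerator, denominator) df pairs
    dist = stats.f(df_num, df_den)
    print(f"df = ({df_num}, {df_den})")
    print("  pdf at F = -1:", dist.pdf(-1.0))                  # 0.0 -- F cannot be negative
    print("  mean:", dist.mean(), " median:", dist.median())   # mean > median -> skewed right
```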
One-Way ANOVA Assumptions
1. The populations have normal distributions.
2. The populations have the same variance σ² (or standard deviation σ).
3. The samples are simple random samples.
4. The samples are independent of each other.
5. The different samples are from populations that are categorized in only one way.
Definition: A treatment (or factor) is a property or characteristic that allows us to distinguish the different populations from one another. Use computer software for ANOVA calculations if possible.
Procedure for testing H0: µ1 = µ2 = µ3 = . . . = µk
1. Use computer software or a calculator to obtain results.
2. Identify the P-value from the display.
3. Form a conclusion based on these criteria:
   If P-value ≤ α, reject the null hypothesis of equal means.
   If P-value > α, fail to reject the null hypothesis of equal means.
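A minimal sketch of this procedure in Python, assuming SciPy is available; the three samples below are made-up illustrative data, not values from the text:

```python
# One-way ANOVA: obtain the P-value and compare it with the significance level alpha.
from scipy import stats

sample1 = [5.1, 4.9, 6.0, 5.5, 5.8]   # hypothetical data for three groups
sample2 = [4.2, 4.8, 4.5, 4.9, 4.4]
sample3 = [6.1, 5.9, 6.4, 6.0, 6.3]

alpha = 0.05
f_stat, p_value = stats.f_oneway(sample1, sample2, sample3)

print("F test statistic:", f_stat)
print("P-value:", p_value)
if p_value <= alpha:
    print("Reject H0: at least one population mean appears to be different.")
else:
    print("Fail to reject H0: the equal-means hypothesis is not rejected.")
```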
[Figure 11-2: Relationships Among Components of ANOVA]
ANOVA Fundamental Concept
Estimate the common value of σ² using:
1. The variance between samples (also called variation due to treatment) is an estimate of the common population variance σ² that is based on the variability among the sample means.
2. The variance within samples (also called variation due to error) is an estimate of the common population variance σ² that is based on the sample variances.
ANOVA Fundamental Concept
Test Statistic for One-Way ANOVA:

F = variance between samples / variance within samples

An excessively large F test statistic is evidence against equal population means.
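As a rough numerical sketch of this ratio for the case of equal sample sizes n (assuming NumPy; the samples are hypothetical): the between-samples estimate is n times the variance of the sample means, and the within-samples estimate is the mean of the sample variances.

```python
# Sketch: F = (variance between samples) / (variance within samples), for equal sample sizes n.
import numpy as np

samples = np.array([[5.1, 4.9, 6.0, 5.5, 5.8],    # hypothetical data: k = 3 groups of n = 5
                    [4.2, 4.8, 4.5, 4.9, 4.4],
                    [6.1, 5.9, 6.4, 6.0, 6.3]])
n = samples.shape[1]

variance_between = n * np.var(samples.mean(axis=1), ddof=1)   # n * variance of the sample means
variance_within = np.mean(np.var(samples, axis=1, ddof=1))    # mean of the sample variances
F = variance_between / variance_within
print("F =", F)
```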
Critical Value of F
Right-tailed test. Degrees of freedom with k samples of the same size n:
numerator df = k - 1
denominator df = k(n - 1)
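For example, a sketch assuming SciPy and made-up values k = 3 samples of size n = 10 with α = 0.05:

```python
# Critical value of F for a right-tailed test with k samples of the same size n.
from scipy import stats

k, n, alpha = 3, 10, 0.05          # illustrative values
df_num = k - 1                     # numerator degrees of freedom
df_den = k * (n - 1)               # denominator degrees of freedom
critical_F = stats.f.ppf(1 - alpha, df_num, df_den)
print(f"df = ({df_num}, {df_den}), critical F = {critical_F:.4f}")
```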
Key Components of ANOVA Method
SS(total), or total sum of squares, is a measure of the total variation (around the overall mean x̄) in all the sample data combined.

SS(total) = Σ(x - x̄)²
Key Components of ANOVA Method
SS(treatment or between) is a measure of the variation between the samples. In one-way ANOVA, SS(treatment) is sometimes referred to as SS(factor). Because it is a measure of variability between the sample means, it is also referred to as SS(between groups) or SS(between samples).

SS(treatment) = n1(x̄1 - x̄)² + n2(x̄2 - x̄)² + . . . + nk(x̄k - x̄)² = Σni(x̄i - x̄)²
Key Components of ANOVA Method
SS(error or within) is a sum of squares representing the variability that is assumed to be common to all the populations being considered.

SS(error) = (n1 - 1)s1² + (n2 - 1)s2² + (n3 - 1)s3² + . . . + (nk - 1)sk² = Σ(ni - 1)si²
Key Components of ANOVA Method
SS(total) = SS(treatment or between) + SS(error or within)
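A small numerical check of these sums of squares (a sketch with NumPy; the groups are the same kind of made-up data used above, not values from the text):

```python
# Compute SS(total), SS(treatment), SS(error) and verify SS(total) = SS(treatment) + SS(error).
import numpy as np

groups = [np.array([5.1, 4.9, 6.0, 5.5, 5.8]),   # hypothetical samples
          np.array([4.2, 4.8, 4.5, 4.9, 4.4]),
          np.array([6.1, 5.9, 6.4, 6.0, 6.3])]

all_data = np.concatenate(groups)
grand_mean = all_data.mean()

ss_total = np.sum((all_data - grand_mean) ** 2)
ss_treatment = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
ss_error = sum((len(g) - 1) * g.var(ddof=1) for g in groups)

print("SS(total)                 =", round(ss_total, 4))
print("SS(treatment)             =", round(ss_treatment, 4))
print("SS(error)                 =", round(ss_error, 4))
print("SS(treatment) + SS(error) =", round(ss_treatment + ss_error, 4))  # equals SS(total)
```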
Mean Squares (MS)
The sums of squares SS(treatment) and SS(error) are divided by their corresponding numbers of degrees of freedom.
MS(treatment or between) is the mean square for treatment, obtained as follows:

MS(treatment) = SS(treatment) / (k - 1)
MS(error or within) is the mean square for error, obtained as follows (where k is the number of samples and N is the total number of sample values combined):

MS(error) = SS(error) / (N - k)

MS(total) = SS(total) / (N - 1)
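Putting the pieces together, here is a sketch continuing the same hypothetical data (assuming NumPy and SciPy): compute the mean squares, form F = MS(treatment) / MS(error), and cross-check against scipy.stats.f_oneway.

```python
# Mean squares and the one-way ANOVA F statistic, cross-checked against scipy.stats.f_oneway.
import numpy as np
from scipy import stats

groups = [np.array([5.1, 4.9, 6.0, 5.5, 5.8]),   # hypothetical samples
          np.array([4.2, 4.8, 4.5, 4.9, 4.4]),
          np.array([6.1, 5.9, 6.4, 6.0, 6.3])]

k = len(groups)                                   # number of samples
N = sum(len(g) for g in groups)                   # total number of values combined
grand_mean = np.concatenate(groups).mean()

ss_treatment = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
ss_error = sum((len(g) - 1) * g.var(ddof=1) for g in groups)

ms_treatment = ss_treatment / (k - 1)             # MS(treatment) = SS(treatment) / (k - 1)
ms_error = ss_error / (N - k)                     # MS(error) = SS(error) / (N - k)
F = ms_treatment / ms_error

f_check, p_value = stats.f_oneway(*groups)
print("F from mean squares:", F)
print("F from f_oneway:    ", f_check, " P-value:", p_value)
```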