Factor Analysis (Dr See)

What are the uses?

Spearman reasoned that if all mental tests were positively correlated, there must be a common variable, or factor, producing the positive correlations. In 1904 Spearman published a major article about intelligence in which he used a statistical method to show that the positive correlations among mental tests resulted from a common underlying factor. His method eventually developed into a more sophisticated statistical technique known as factor analysis. Using factor analysis, it is possible to identify clusters of tests that measure a common ability.

• Used to reduce a large number of observable instances to a smaller number of unobservable constructs/traits.
• Used to identify a small number of factors that represent the relationships among sets of interrelated variables.
• The mathematical model for factor analysis is as follows:

  X_i = A_{i1} F_1 + A_{i2} F_2 + ... + A_{ik} F_k + U_i

  where the F_j are the common factors, U_i is the unique factor not explained by the common factors, and the A_{ij} are the constants used to combine the k factors.

How to run factor analysis?

1. Compute a correlation matrix. The Factor command creates a correlation matrix from the raw data. Variables that are not related to the others can be identified, and the appropriateness of the factor model evaluated; most correlations should be large (> 0.3). Bartlett's test of sphericity tests the hypothesis that the correlation matrix is an identity matrix (all diagonal terms = 1 and all off-diagonal terms = 0, meaning the variables are unrelated). The test statistic must be large and the significance level small (e.g. 0.00), showing the matrix is not an identity matrix; otherwise do not use factor analysis. The Kaiser-Meyer-Olkin (KMO) measure, an index comparing the magnitude of the observed correlation coefficients to that of the partial correlation coefficients, should be > 0.6 for sampling adequacy. If it is < 0.5, factor analysis may not be appropriate, because the correlations between pairs of variables cannot be explained by the other variables. (A short computational sketch of these checks follows the extraction step below.)

2. Extract the factors. This is similar to multiple regression analysis: first enter the independent variable that significantly explains a large amount of the variance observed in the dependent variable, then the next independent variable that accounts for the greatest remaining variation in the dependent variable, and so on until no further variable significantly explains additional variance. Factor analysis, which does not distinguish dependent from independent variables, instead selects the combination of variables whose correlations explain the greatest amount of total variance for factor 1, then the combination that explains the greatest amount of the remaining variance for factor 2, then the 3rd, the 4th, and so on.
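As a rough illustration of step 1 (not part of the original notes), the sketch below assumes a hypothetical pandas DataFrame df whose columns are the observed items, and computes the correlation matrix checks, Bartlett's test of sphericity and the overall KMO measure from the standard formulas; the function name step1_checks is made up for this example.

```python
import numpy as np
import pandas as pd
from scipy import stats

def step1_checks(df: pd.DataFrame) -> None:
    """Correlation matrix, Bartlett's test of sphericity and overall KMO for a DataFrame of items."""
    R = df.corr().to_numpy()               # correlation matrix of the observed variables
    n, p = df.shape                        # n cases, p variables
    off = ~np.eye(p, dtype=bool)           # mask for the off-diagonal entries

    # Rule of thumb from the notes: most correlations should exceed 0.3.
    print("share of |r| > 0.3:", round(float((np.abs(R[off]) > 0.3).mean()), 2))

    # Bartlett's test of sphericity: H0 says R is an identity matrix.
    chi2 = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(R))
    dof = p * (p - 1) / 2
    print("Bartlett chi-square =", round(float(chi2), 2),
          "p =", round(float(stats.chi2.sf(chi2, dof)), 4))   # want a small p-value

    # KMO: observed correlations compared with partial (anti-image) correlations.
    inv_R = np.linalg.inv(R)
    partial = -inv_R / np.sqrt(np.outer(np.diag(inv_R), np.diag(inv_R)))
    kmo = (R[off] ** 2).sum() / ((R[off] ** 2).sum() + (partial[off] ** 2).sum())
    print("KMO =", round(float(kmo), 3))   # want > 0.6; below 0.5 is poor
```

With hypothetical data, a significant Bartlett result together with a KMO above 0.6 would support going on to extraction.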

Every variable is initially given a communality value of 1.0 (range 0 to 1). The communality shows the proportion of a variable's variance that the common factors explain: 0 means the common factors explain none of the variance in that variable, and 1 means all of the variance in that variable is explained by the common factors. After the first factor is extracted (e.g. Factor 1 with eigenvalue = 5.13312, the largest, > 1), each eigenvalue shows the proportion of variance accounted for by that factor, not by each variable. Successive factors have smaller eigenvalues, and the cumulative % of variance explained totals 100% even though the last factors do not explain a significant amount of additional variance.
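A minimal sketch of where the extraction table's eigenvalues and cumulative percentages come from, again assuming the hypothetical DataFrame df from the previous sketch: the eigenvalues of the correlation matrix are sorted, and each is expressed as a share of the total variance (which, for standardised variables, equals the number of variables).

```python
import numpy as np

def extraction_table(df) -> None:
    """Eigenvalues of the correlation matrix and the % of variance each factor accounts for."""
    eigvals = np.sort(np.linalg.eigvalsh(df.corr().to_numpy()))[::-1]  # largest first
    pct = 100 * eigvals / eigvals.sum()     # total variance of standardised items = number of items
    cum = np.cumsum(pct)                    # cumulative % ends at 100
    for i, (ev, share, c) in enumerate(zip(eigvals, pct, cum), start=1):
        print(f"Factor {i}: eigenvalue = {ev:.3f}, % of variance = {share:.1f}, cumulative % = {c:.1f}")
```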

(VARIANCE: in statistics, the square of the standard deviation; a measure of the spread or variation of a group of numbers in a sample. The population variance divides the sum of squared deviations from the mean by the number of observations N, while the sample variance divides it by N - 1.)
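In symbols, the two versions of the variance referred to above are:

```latex
\sigma^{2} = \frac{1}{N}\sum_{i=1}^{N}\left(x_{i}-\mu\right)^{2}
\qquad
s^{2} = \frac{1}{n-1}\sum_{i=1}^{n}\left(x_{i}-\bar{x}\right)^{2}
```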

3. Factor selection, to create a more understandable factor structure or to keep only the factors of interest to you. First decide which factors you wish to retain (criteria: they must have face or theoretical validity, or an eigenvalue > 1, or be selected from the scree plot). It is almost impossible to interpret what each factor means before rotation.

Select only the factors on the steep portion of the scree plot for rotation; by default, SPSS selects and rotates the factors with eigenvalue > 1.
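A short scree-plot sketch (illustrative only, assuming the same hypothetical DataFrame df as before), which plots the eigenvalues in descending order so the steep portion and the eigenvalue > 1 cut-off can be seen:

```python
import numpy as np
import matplotlib.pyplot as plt

def scree_plot(df) -> None:
    """Plot eigenvalues in descending order; retain factors on the steep part or with eigenvalue > 1."""
    eigvals = np.sort(np.linalg.eigvalsh(df.corr().to_numpy()))[::-1]
    factors = np.arange(1, len(eigvals) + 1)
    plt.plot(factors, eigvals, marker="o")
    plt.axhline(1.0, linestyle="--")        # Kaiser criterion: eigenvalue > 1
    plt.xlabel("Factor number")
    plt.ylabel("Eigenvalue")
    plt.title("Scree plot")
    plt.show()
```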

• Rotation: once the factors are selected, rotate them, because it is difficult to interpret the factors without rotation. The reason to rotate is to get a simple structure, i.e. some factors have high and some have low factor loadings (loadings range from -1 to +1 and indicate the strength of the relationship between the variable and the factor). Ideally, a simple structure has each variable loading fully on one factor and not on all the other factors.

In reality, the data can only come close to the rotated axes compared with the un-rotated ones. Rotation does not change the underlying factor structure. By default, SPSS rotates the variables with Varimax rotation (an orthogonal rotation, as the axes remain at 90 degrees to each other). You can also use Oblimin or Promax rotation to get a better picture of the factors.

4. Interpret the results. Ideally (this rarely happens), each variable will load high (> 0.5, meaning it has good face validity and appears to measure some underlying construct) on one factor and low (< 0.2) on the other factors. In practice, a variable may load highly on 2-3 different factors, so factor analysis alone rarely produces a clear result; we need other supporting evidence.
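The sketch below illustrates the idea of Varimax rotation in plain numpy; it is not SPSS's implementation, and it assumes principal-component extraction of n_factors factors from the hypothetical DataFrame df used earlier.

```python
import numpy as np

def varimax(loadings, max_iter=100, tol=1e-6):
    """Orthogonal Varimax rotation of a loading matrix (standard SVD-based iteration)."""
    L = np.asarray(loadings, dtype=float)
    p, k = L.shape
    R = np.eye(k)
    var_old = 0.0
    for _ in range(max_iter):
        Lr = L @ R
        grad = L.T @ (Lr ** 3 - Lr @ np.diag((Lr ** 2).sum(axis=0)) / p)
        u, s, vt = np.linalg.svd(grad)
        R = u @ vt
        if s.sum() - var_old < tol:        # stop once the criterion no longer improves
            break
        var_old = s.sum()
    return L @ R

def rotated_loadings(df, n_factors):
    """Principal-component extraction of n_factors factors followed by Varimax rotation."""
    eigvals, eigvecs = np.linalg.eigh(df.corr().to_numpy())
    keep = np.argsort(eigvals)[::-1][:n_factors]
    unrotated = eigvecs[:, keep] * np.sqrt(eigvals[keep])   # unrotated loadings
    return varimax(unrotated)                               # rows: variables, columns: factors
```

Scanning each row of the returned matrix for a loading above 0.5 on one factor and below 0.2 on the others corresponds to the simple structure described above.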

How to use SPSS to run factor analysis

1. Click on Analyze, then Data Reduction, then Factor. This opens the factor analysis dialog window. Paste the variables you wish to analyze into the Variables box. For the Rotation button, select Varimax. Click on the Descriptives button to open the Descriptives dialog window.

2. If you select the Univariate option, the output shows 4 columns: (1) variable names, (2) means, (3) standard deviations, and (4) variable labels.

3. By default, Initial solution is selected. This shows columns of (1) variable names, (2) initial communalities (1.0), (3) factors, (4) eigenvalues, (5) % of variance, and (6) cumulative % of variance accounted for by each factor.

4. Under Correlation Matrix, we have (1) the coefficients of the variables, (2) the significance level for each correlation, (3) the determinant of the matrix to test for multivariate normality, and (4) KMO and Bartlett's test of sphericity, which test for multivariate normality and the sampling adequacy of the variables for using factor analysis.

How to Read

• Examination of the correlation matrix in the output should show most values > 0.3, so it is suitable to use factor analysis.

• In the KMO and Bartlett's test table, Bartlett's test of sphericity is significant and the KMO measure of sampling adequacy is > 0.6. Only 3 factors are extracted, as their eigenvalues are > 1.0, and together they account for 55.3% of the total variance.
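These reading rules can be collected into a small hypothetical helper (reusing the Bartlett p-value, KMO and eigenvalues computed in the earlier sketches):

```python
def read_output(bartlett_p, kmo, eigvals):
    """Apply the notes' rules of thumb: Bartlett significant, KMO > 0.6, retain eigenvalues > 1."""
    retained = [ev for ev in eigvals if ev > 1.0]            # Kaiser criterion
    cumulative = 100 * sum(retained) / sum(eigvals)          # % of total variance they account for
    print("Bartlett significant (p < 0.05):", bartlett_p < 0.05)
    print("KMO adequate (> 0.6):", kmo > 0.6)
    print(f"Factors retained: {len(retained)}, explaining {cumulative:.1f}% of total variance")
```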
