Exploratory Factor Analysis and Principal Component Analysis
Similarities and Differences between Exploratory Factor Analysis and Principal Component Analysis
Exploratory factor analysis and principal component analysis share many assumptions. Both techniques assume that the variables are measured at the interval or ratio level, that the relationships between the observed variables are linear, and that every observed variable is normally distributed. In addition, both analyses assume that every pair of observed variables follows a bivariate normal distribution.
However, the analyses differ in several ways. First, the objective of principal component analysis is to explain as much of the total variance as possible, whereas the goal of exploratory factor analysis is to explain the correlations between the variables. Second, in exploratory factor analysis the observed variables are expressed as linear combinations of the factors, whereas in principal component analysis the components are computed as linear combinations of the original variables. Third, principal component analysis is used to reduce the data to a small number of components, whereas exploratory factor analysis helps to identify the constructs that underlie the data. Fourth, principal component analysis decomposes the full correlation matrix, whereas exploratory factor analysis decomposes an adjusted (reduced) correlation matrix. Fifth, in mathematical terms, the difference between the two lies in the values placed on the diagonal of the correlation matrix. In principal component analysis, 1.00s are placed on the diagonal, meaning that all of the variance in the matrix is to be accounted for; the analysis therefore includes the entire variance of the variables. In exploratory factor analysis, by contrast, the communalities are placed on the diagonal, meaning that only the variance shared with the other variables is to be accounted for; by definition, the analysis therefore involves only the variance that is common among the factors.
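To make the fifth difference concrete, the short Python sketch below, which is not part of the original analysis and uses synthetic NumPy data purely for illustration, eigendecomposes the same correlation matrix twice: once with 1.00s on the diagonal, as principal component analysis does, and once with squared multiple correlations substituted as initial communality estimates, as principal-axis exploratory factor analysis does.

```python
# Minimal sketch (synthetic data, not the study's items): what PCA and
# principal-axis EFA actually decompose.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))            # 200 cases, 6 hypothetical variables
X[:, 3:] += X[:, :3] * 0.8               # induce some shared variance

R = np.corrcoef(X, rowvar=False)         # correlation matrix, 1.00s on the diagonal

# PCA: eigendecompose R as-is, so the total (unit) variance of every
# variable is analysed.
pca_eigvals, _ = np.linalg.eigh(R)

# Principal-axis EFA: replace the diagonal with initial communality
# estimates (squared multiple correlations), so only shared variance
# is analysed.
R_inv = np.linalg.inv(R)
smc = 1.0 - 1.0 / np.diag(R_inv)         # squared multiple correlations
R_reduced = R.copy()
np.fill_diagonal(R_reduced, smc)
efa_eigvals, _ = np.linalg.eigh(R_reduced)

print("PCA eigenvalues:", np.round(np.sort(pca_eigvals)[::-1], 3))
print("EFA (reduced matrix) eigenvalues:", np.round(np.sort(efa_eigvals)[::-1], 3))
```

The reduced matrix typically yields smaller leading eigenvalues than the full correlation matrix, reflecting that only common variance is being modelled.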
Importance of Exploratory Factor Analysis and Communality
Exploratory factor analysis belongs to a family of data reduction approaches that identify common themes among items, including survey items, through the application of matrix algebra. These approaches differ in their theoretical assumptions and computational techniques, but each works toward the same end of identifying the best items for measuring a common underlying theme (Hadi et al., 2016). The particular subject matter of a given study may be quite different from one's own, but the methods and principles of these analytic techniques can be applied to any problem in which a large number of variables must be reduced to a small set organized around an interpretable underlying theme or themes.
The focus on communality in subsets of variables is significant for the overall analysis. Principal component analysis rests on the assumption that, before extraction, all of the variance associated with a variable is common variance, so the communalities are all equal to 1. After extraction, the communality indicates how much of the variance in each item is explained by the retained components (Howard, 2016). A low communality indicates that the item does not fit well with the other items in its component. For instance, a value of 0.383 in the extraction column of the communalities table implies that 38.3% of the variance associated with that item is common variance. Before extraction there are as many factors as variables, so the communalities are all 1 and the factors explain all of the variance (Hadi et al., 2016). Factor extraction discards some of the factors, and some information is lost; the factors retained after rotation therefore cannot explain all of the variance in the data, only part of it. Ultimately, the communalities represent the proportion of variance in each variable that can be explained by the factors retained after extraction.
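As a further illustration of how an extraction communality such as 0.383 arises, the following sketch, which uses hypothetical data rather than the study's items, retains components under the Kaiser criterion and sums each variable's squared loadings on them.

```python
# Minimal sketch, not the study's SPSS output: an extraction communality
# is the sum of a variable's squared loadings on the retained components.
import numpy as np

X = np.random.default_rng(1).normal(size=(200, 6))   # hypothetical data
X[:, 1] += X[:, 0]                                   # give some items
X[:, 3] += X[:, 2]                                   # shared variance
R = np.corrcoef(X, rowvar=False)

eigvals, eigvecs = np.linalg.eigh(R)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

k = int(np.sum(eigvals > 1.0))                       # Kaiser criterion: eigenvalues > 1
loadings = eigvecs[:, :k] * np.sqrt(eigvals[:k])     # component loadings

# Each variable's communality = sum of its squared loadings on the k
# retained components; e.g. 0.383 would mean 38.3% common variance.
communalities = np.sum(loadings ** 2, axis=1)
print(np.round(communalities, 3))
```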
Results and Findings
Table 1: KMO and Bartlett's Test

| Test | Statistic | Value |
| --- | --- | --- |
| Kaiser-Meyer-Olkin Measure of Sampling Adequacy | | .853 |
| Bartlett's Test of Sphericity | Approx. Chi-Square | 4751.733 |
| | df | 435 |
| | Sig. | .000 |
The 201 items were subjected to principal component analysis (PCA) using SPSS. Before carrying out the PCA, the suitability of the data for factor analysis was examined. Sampling is considered adequate when the Kaiser-Meyer-Olkin (KMO) value exceeds 0.5; here the KMO value is 0.853 (see Table 1), well above that suggested minimum. The strength of the correlations can be gauged with Bartlett's Test of Sphericity, which is statistically significant (p < .001; see Table 1), supporting the factorability of the correlation matrix. The significant result means that the null hypothesis that the correlation matrix is an identity matrix is rejected; in other words, the variables are sufficiently correlated for the analysis. The principal component analysis revealed the presence of seven components with eigenvalues exceeding 1 (Hadi et al., 2016).
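The adequacy statistics in Table 1 come from SPSS. As an illustration only, the sketch below computes the same two quantities from first principles in Python on placeholder random data; the data set, its shape, and the function names are assumptions of this example, not part of the study.

```python
# Minimal sketch of the adequacy checks reported in Table 1 (KMO and
# Bartlett's test); the values 0.853 and p < .001 come from the study's
# own SPSS run, not from this placeholder data.
import numpy as np
from scipy.stats import chi2

def bartlett_sphericity(X):
    """Chi-square test that the correlation matrix is an identity matrix."""
    n, p = X.shape
    R = np.corrcoef(X, rowvar=False)
    statistic = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(R))
    df = p * (p - 1) / 2
    return statistic, df, chi2.sf(statistic, df)

def kmo(X):
    """Overall Kaiser-Meyer-Olkin measure of sampling adequacy."""
    R = np.corrcoef(X, rowvar=False)
    R_inv = np.linalg.inv(R)
    # Anti-image (partial) correlations from the inverse correlation matrix.
    d = np.sqrt(np.outer(np.diag(R_inv), np.diag(R_inv)))
    partial = -R_inv / d
    np.fill_diagonal(R, 0.0)
    np.fill_diagonal(partial, 0.0)
    return np.sum(R ** 2) / (np.sum(R ** 2) + np.sum(partial ** 2))

X = np.random.default_rng(2).normal(size=(201, 30))   # placeholder data only
print(bartlett_sphericity(X))
print(kmo(X))
```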
Table 2: Total Variance Explained

| Component | Total (Initial Eigenvalues) | % of Variance (Initial) | Cumulative % (Initial) | Total (Extraction Sums of Squared Loadings) | % of Variance (Extraction) | Cumulative % (Extraction) |
| --- | --- | --- | --- | --- | --- | --- |
| 1 | 7.422 | 24.739 | 24.739 | 7.422 | 24.739 | 24.739 |
| 2 | 4.819 | 16.062 | 40.801 | 4.819 | 16.062 | 40.801 |
| 3 | 2.356 | 7.854 | 48.655 | 2.356 | 7.854 | 48.655 |
| 4 | 1.654 | 5.514 | 54.169 | 1.654 | 5.514 | 54.169 |
| 5 | 1.276 | 4.254 | 58.423 | 1.276 | 4.254 | 58.423 |
| 6 | 1.161 | 3.870 | 62.293 | 1.161 | 3.870 | 62.293 |
| 7 | 1.039 | 3.465 | 65.758 | 1.039 | 3.465 | 65.758 |
| 8 | .937 | 3.125 | 68.883 | | | |
| 9 | .843 | 2.810 | 71.693 | | | |
| 10 | .808 | 2.693 | 74.386 | | | |
| 11 | .785 | 2.617 | 77.003 | | | |
| 12 | .694 | 2.313 | 79.316 | | | |
| 13 | .607 | 2.023 | 81.338 | | | |
| 14 | .568 | 1.895 | 83.233 | | | |
| 15 | .535 | 1.783 | 85.016 | | | |
| 16 | .511 | 1.702 | 86.718 | | | |
| 17 | .461 | 1.535 | 88.253 | | | |
| 18 | .421 | 1.405 | 89.658 | | | |
| 19 | .390 | 1.301 | 90.959 | | | |
| 20 | .367 | 1.224 | 92.183 | | | |
| 21 | .338 | 1.126 | 93.309 | | | |
| 22 | .293 | .978 | 94.287 | | | |
| 23 | .277 | .924 | 95.211 | | | |
| 24 | .258 | .860 | 96.071 | | | |
| 25 | .244 | .814 | 96.885 | | | |
| 26 | .228 | .759 | 97.643 | | | |
| 27 | .211 | .703 | 98.347 | | | |
| 28 | .175 | .583 | 98.930 | | | |
| 29 | .170 | .567 | 99.497 | | | |
| 30 | .151 | .503 | 100.000 | | | |

Extraction Method: Principal Component Analysis.
These seven components together explain 65.758% of the total variance. The first eigenvalue is 7.422 and explains 24.739% of the variance in the original data. The second eigenvalue is 4.819 and explains 16.062% of the variance. The third eigenvalue is 2.356 and explains 7.854% of the variance. The fourth eigenvalue is 1.654 and explains 5.514% of the variance in the original data. The fifth eigenvalue is 1.276 and explains 4.254% of the variance. The sixth eigenvalue is 1.161 and explains 3.870% of the total variance. The seventh eigenvalue is 1.039 and explains 3.465% of the variance. To interpret the eigenvalues, a scree plot is used to present them graphically.
Figure 1: Scree Plot
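For readers who want to reproduce this kind of eigenvalue summary and scree plot outside SPSS, the following Python sketch, which uses a hypothetical correlation matrix rather than the study's data and assumes matplotlib is available, prints each component's eigenvalue, percentage of variance, and cumulative percentage, and draws the scree plot with the eigenvalue = 1 cut-off line.

```python
# Minimal sketch: eigenvalue table and scree plot of the kind summarised
# above, computed from a hypothetical 30-variable correlation matrix R.
import numpy as np
import matplotlib.pyplot as plt

R = np.corrcoef(np.random.default_rng(3).normal(size=(201, 30)), rowvar=False)

eigvals = np.sort(np.linalg.eigvalsh(R))[::-1]   # eigenvalues, largest first
pct_variance = 100 * eigvals / eigvals.sum()     # % of variance per component
cumulative = np.cumsum(pct_variance)             # cumulative %

for i, (ev, pct, cum) in enumerate(zip(eigvals, pct_variance, cumulative), start=1):
    print(f"Component {i:2d}: eigenvalue={ev:6.3f}  %var={pct:6.3f}  cum%={cum:7.3f}")

# Scree plot: eigenvalues against component number, with the Kaiser
# criterion line at eigenvalue = 1 marking the retention cut-off.
plt.plot(range(1, len(eigvals) + 1), eigvals, marker="o")
plt.axhline(1.0, linestyle="--")
plt.xlabel("Component number")
plt.ylabel("Eigenvalue")
plt.title("Scree plot")
plt.show()
```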
Principal component analysis identifies the commonalities among the variables empirically and bases the component weights on the strength of the empirical associations between those variables (Kong et al., 2017). A disadvantage is that it implicitly assumes that only items with strong correlations with one another are relevant to the measure being constructed, which may be debatable in some instances (Jolliffe & Cadima, 2016).
The strengths of exploratory factor analysis are that it is easy to use, well suited to large sets of survey questions, and forms the basis for other instruments. Its limitations are that the variables must be measured at the interval level and that the reduced set of factors must be considerably smaller than the number of original variables (Watkins, 2018).
Conclusion
As discussed, although exploratory factor analysis and principal component analysis are similar, they differ in many respects. The study carried out a principal component analysis on the 201 items. The seven retained components explain 65.758% of the total variance. A scree plot is used to present the eigenvalues graphically. Evidently, both principal component analysis and exploratory factor analysis have strengths and limitations.
References
Hadi, N. U., Abdullah, N., & Sentosa, I. (2016). An easy approach to exploratory factor analysis: Marketing perspective. Journal of Educational and Social Research, 6(1), 215.
Howard, M. C. (2016). A review of exploratory factor analysis decisions and overview of current practices: What are we doing and how can we improve? International Journal of Human-Computer Interaction, 32(1), 51-62.
Jolliffe, I. T., & Cadima, J. (2016). Principal component analysis: A review and recent developments. Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, 374(2065), 20150202.
Kong, X., Hu, C., & Duan, Z. (2017). Generalized principal component analysis. In Principal Component Analysis Networks and Algorithms (pp. 185-233). Springer, Singapore.
Watkins, M. W. (2018). Exploratory factor analysis: A guide to best practice. Journal of Black Psychology, 44(3), 219-246.